Good day doctorSIM!
Actually, there was a really great post up on Moz last month about this very thing: http://moz.com/blog/hreflang-behaviour-insights
Enjoy!
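If it helps while you read that post, a minimal hreflang setup looks something like the sketch below. The URLs are hypothetical, assuming an English page with a Spanish alternate; adjust to your own language/country pairs.

```html
<!-- Goes in the <head> of BOTH language versions; each page
     lists every version, including itself -->
<link rel="alternate" hreflang="en" href="http://www.example.com/page/" />
<link rel="alternate" hreflang="es" href="http://www.example.com/es/page/" />
<!-- Optional fallback for visitors whose language you don't target -->
<link rel="alternate" hreflang="x-default" href="http://www.example.com/page/" />
```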
Greetings Samantha,
Combining all your sites into one has several advantages, for example:
Having a main group site with individual locality pages within it is definitely doable from a local ranking perspective. We have a franchise client with over 50 brick-and-mortar locations in 3 states, and we are able to rank them locally with the same amount of effort you would put into a separate site (from an SEO perspective). You'll want to make sure, at a minimum, you do the following:
As for a recommendation, I would say do what fits best for you. It seems from what you're saying there are some financial benefits to merging and there are no SEO hurdles to prevent you from doing so. Good luck!
If you are using rel canonical then you can have the same on each page and it should be okay.
Otherwise, I would make sure your paginated pages don't have it. The next/prev markup helps Google understand that these are subsequent pages of the original category, but it doesn't really give instruction as to the preferred page (like the canonical would), so you could end up with Google ignoring the content after it sees it too many times.
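As a rough sketch (hypothetical URL structure), one common pattern combines a self-referencing canonical with the prev/next hints; here is what the head of page 2 of a category might carry:

```html
<!-- <head> of /category?page=2 -->
<link rel="canonical" href="http://www.example.com/category?page=2" />
<link rel="prev" href="http://www.example.com/category" />
<link rel="next" href="http://www.example.com/category?page=3" />
```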
Yes, it doesn't surprise me that you'd be having problems on those higher page categories. Testing is always the way to go when in doubt. Out of curiosity, what e-commerce system are you using?
I personally prefer the trailing slash, but it doesn't make any difference as long as you're consistent. If, as you say, Google is already indexing most without it, I'd probably go that way too!
Greetings Oren,
Good question. We use this tactic a lot with clients and on our own website, but we don't necessarily include it on every page or post, in an attempt to reduce blindness to the form. We do it quite often, though. The key is to make sure you don't just throw the form in there but actually call the reader to action in some way, e.g.: "For more information on (what the post topic is about), sign up for our monthly newsletter below:"
It definitely improves conversion rates (although I don't have any specific numbers for you), and with the right hook in the call to action it is very effective in lead generation, as well as for classifying your leads into buckets of interest for different types of email campaigns (or whatever).
Greetings alrockn!
You do have quite the dilemma here. I actually think you will have problems if you leave it all as-is; you're between a rock and a hard place!
Most e-commerce programs do a terrible job on the technical SEO front out of the box and require some degree of customization to get it all straightened out. The pagination of category pages is a very common problem. I will take your word for it that you cannot modify your template(s) but any reasonable suggestion I think is going to require some degree of template modification.
The problem you're most likely going to run into is a thin content issue on your category pages. I'm assuming all of those paginated page versions would also have the same category description (if any) and if there is nothing unique about your main page Google is likely to ignore it.
To address your question on hard coding the first page as the canonical, I think that is really the only option you have. You'll want to make sure that category page has some level of unique content on it (e.g., category description text) so it stands out enough to attract Google's attention.
Could you not do some conditional coding to check the page version and modify the canonical accordingly?
Good day MozAddict!
SEO for Magento is near and dear to my heart. From a technical SEO perspective, I would recommend cleaning up the items you mentioned, as they can cause issues. The biggest concern is trust flow and having trust split between two versions of the page (i.e., the slash and no-slash versions).
Both the 301 and the canonical tag will pass the same amount of trust. So your question is, which do you go with? I think both are fine; however, I prefer the 301 myself for dealing with the trailing slash issue, and here's why.
As time passes, believe it or not, people will link to some of your pages naturally. Because a canonical or 301 doesn't pass the full trust earned from the link, I'd rather someone link to the correct version in the first place. If I'm using the canonical tag, they may indeed link to the non-preferred version and I would lose some of that trust, whereas if I am using the 301, they will automatically be shown the correct, preferred version and I earn all the trust from that natural link.
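To make the 301 route concrete, here's a sketch for an Apache setup with mod_rewrite (Magento installs vary, so treat this as illustrative only); it 301s any trailing-slash URL to the no-slash version while leaving real directories alone:

```apacheconf
# Sketch only: 301 trailing-slash URLs to the no-slash version.
# Assumes Apache with mod_rewrite enabled; flip the rule around
# if you standardize on the slash version instead.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)/$ /$1 [R=301,L]
```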
Moz has a great article on canonicalization if you want to read more on it.
Hope this answer is useful to you!
It should work with posts, pages, custom post types, etc. without needing that plugin.
Good luck!
I don't host my WordPress sites on IIS but according to their site, you have three choices for custom permalinks on IIS-
Microsoft IIS 7+ web server with the URL Rewrite 1.1+ module and PHP 5 running as FastCGI
Microsoft IIS 6+ using ISAPI_Rewrite (free for single-site server, $$ for multi-site server)
Microsoft IIS 6+ using Ionic ISAPI Rewriting Filter (IIRF) (free for single-site or multi-site server)
I got the above from this page.
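For reference, option 1 (URL Rewrite on IIS 7+) typically boils down to a web.config rule like the following sketch; this is the commonly published WordPress permalink rule, so verify it against your own setup before relying on it:

```xml
<!-- Sketch of a web.config for WordPress pretty permalinks on
     IIS 7+ with the URL Rewrite module: send any request that
     isn't a real file or directory to index.php -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="WordPress permalinks" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
            <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
          </conditions>
          <action type="Rewrite" url="index.php" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```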
I checked Moz's Open Site Explorer and Ahrefs which are both good sources of backlink data.
Structured data is a nice thing to have but wouldn't necessarily hurt your ranking. It can help Google more easily make sense of your content and also help you stand out a bit more in the SERPs if Google chooses to show a rich snippet result for you.
Getting rid of bad backlinks can be a manual task of reaching out and contacting webmasters and there are some tools that can make that process a little less time consuming. However, I didn't review your links for quality, just noting there were a lot of links from a small number of domains. The top one looked like the personal site of the company owner or broker.
Just a quick note: adding it to robots.txt instructs crawlers not to crawl the URL, but it can still be indexed if it is being linked to from other places (and it probably is if Moz's crawler is finding it).
The easiest way to solve the problem, if you don't want them indexed, is to edit your search results page template and add a meta noindex to the <head>.
I personally don't like to see search results surfaced to search engines (unless there's a strategic reason for doing so).
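The tag itself is standard; where your template's head section lives depends on your platform. A sketch of what goes in the search results template:

```html
<!-- In the <head> of the search results template: keep the page
     out of the index, but still let crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```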
Greetings,
I don't know that it is related, but recently we were claiming listings for a client with 50+ brick-and-mortar locations, and after a month or two of having them claimed, a handful of them went back into pending. I believe it is likely a bug or something. The easiest thing is to just re-verify them; however, if there is an issue with calls or sending a postcard, you'll have to call their support line... and wait a very long time (relatively speaking, for the web).
Greetings James! Welcome to the fun-filled and often onerous world of SEO.
Just taking a quick look at your site, you seem to have a lot of content pages, etc. but very little in the way of trust flowing to your site. For example, I can see you have thousands of backlinks but they're only spread out over forty or so unique domains.
As real estate is often local, you'd do well to try and rank in the local pack results for real estate related searches. However, I notice a Google+ page that looks like yours, but it doesn't have the same address as your website. You want that to match up, and then also start building credibility to your Google+ page through positive reviews, etc. from users.
Real estate is a super-competitive niche and your best bet is to (at least until you have more trust) target more of the long tail of search. Those are just a few tips to get you started but anytime you're doing a competitive niche in a big city / region, it's not going to be a quick and easy task. Keep at it though; you'll get there!
Greetings! I think you're going to be better off creating an evergreen page that you can build trust to over time.
Because blog posts are often time-stamped, Google may tend to ignore one as it gets stale. The exception would be if you were to, as Jimmy is suggesting, create a category page or something similar with unique content AND an aggregation of recent blog posts in that category. But you'd need both to keep it ranked over time.
Hello!
Is it just you have both www and non-www versions of all pages that are resolving? If so you can add one 301 redirect rule in IIS to redirect all of them from one to the other and solve the problem. If not, feel free to provide more detail and I or someone else can chime in.
EDIT
I just took a quick look and it looks like that's part of the problem. Follow the above and it should take care of it. I also noted the non-SSL version is 302 redirecting to the SSL version. That is an incorrect implementation. You want that to be a 301 so if someone links to the non-SSL version you get credit for that link juice.
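As a sketch of both fixes in one place (IIS 7+ with the URL Rewrite module; the hostnames are hypothetical, so swap in your preferred version), a single rule can 301 both the www host and the non-SSL scheme to the canonical SSL URL:

```xml
<!-- Sketch: permanently redirect www and/or non-HTTPS requests
     to the canonical https://example.com version in one rule.
     MatchAny fires the rule if EITHER condition is true. -->
<rule name="Canonical host and HTTPS" stopProcessing="true">
  <match url="(.*)" />
  <conditions logicalGrouping="MatchAny">
    <add input="{HTTP_HOST}" pattern="^www\.example\.com$" />
    <add input="{HTTPS}" pattern="off" />
  </conditions>
  <action type="Redirect" url="https://example.com/{R:1}" redirectType="Permanent" />
</rule>
```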
Cheers!
I think it is possible to add 1000s of pages of unique content over time. 
Excluding (I assume that's what you mean) the layered nav pages in robots.txt will keep Google from crawling those pages, but not necessarily from indexing them. So you end up with something like this in the SERP - http://screencast.com/t/SPKDV09SM9 (Note the meta description.)
If you didn't want them in the SERP, you'd need to add a meta noindex to the <head> of the page as well.
I don't really like splash/doorway/gateway pages like those from a usability perspective. If I'm going to drive traffic to a page on my client's e-commerce site, I want that page to have active product on it. I realize you can do some Magento hoodoo and get product on static pages but it's not worth the effort in my opinion. You're better off focusing on conversions from your layered nav pages with unique content.
I like #5 for the reasons you've stated. Also keywords in the URI string aren't as strong a ranking factor (in my opinion) as they used to be. My 2 cents.
Greetings Pamela!
This is nothing to worry about at all. UTF-8 is simply a type of character encoding, and it is declared in a website's pages to instruct web browsers on how to interpret the characters. See- http://screencast.com/t/s4I2RNsgqUh
As it's perfectly normal and not a negative, there's no need to change it at all.
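For reference, the declaration the report is picking up is typically just this one line in the page's head (HTML5 form shown; older pages use a longer http-equiv variant that means the same thing):

```html
<!-- Declares the page's character encoding; leave as-is -->
<meta charset="utf-8">
```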
Hello! Well this is one of the last messages you want to receive in regards to your website.
Google indicated they have applied a manual action to your site (Google Penguin is an algorithmic action, not a manual one). Within the email they gave you a basic set of marching orders on what you're going to have to do, although they don't make it seem as onerous as it really is. We have had quite a number of clients come to us with link-related problems, and I will tell you it is a pain.
Your best bet for removing the bad links is to use an automated tool to help you identify the worst links. Some tools also include a way to gather contact info and keep track of link statuses for you, etc. which is convenient, particularly when you're looking at the number of links your site has.
To make sure you get a full list of links, consider downloading lists from several backlink providers (Open Site Explorer is one), de-dupe your list and use that as your master list. Any links you can't get removed you will want to add to a disavow list and upload to GWT (although my opinion is this doesn't really do anything to benefit you other than show Google you're trying).
When you file your reconsideration request, unless you've been extremely thorough you can expect them to reject it outright. I repeat, you have to get that link profile cleaned up!
Google has indicated in some instances, it may be better to start over with a different domain (not necessarily my opinion in your case, just making you aware).