Questions
Robots.txt Question for E-Commerce Sites
You're right on target: it's not a good idea to index search results. Google doesn't want to crawl or index other sites' search results within its own search results. There are exceptions for gigantic sites like Yelp or TripAdvisor, where their search results pages genuinely are the best option, but if you're not at that level, and especially if you're an ecommerce site, it's not recommended. You wouldn't harm anything by excluding search from your robots.txt file. In fact, many top sites exclude search results to preserve crawl capacity and for indexation reasons. You'll also want to look at parameter handling in Search Console; this article from Google will get you started.
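If your internal search results live at a predictable path, the robots.txt exclusion is usually just a couple of Disallow rules. A minimal sketch (the /search/ path and ?s= parameter below are placeholders — substitute whatever your platform actually uses):

```text
User-agent: *
# Block crawling of internal search result pages
Disallow: /search/
# Block URLs generated by a search query parameter
Disallow: /*?s=
```

Note that Disallow stops crawling, not indexing — URLs that are already indexed may need a noindex tag or removal request as well.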
On-Page / Site Optimization | Joe.Robison
Any recommendations for an XML Sitemap for a large community website?
Have you tried the Yoast SEO sitemap feature or All in One SEO? Yoast works fine for me; I've never had any issues with it.
On-Page / Site Optimization | Verve-Innovation
Search Analytics Question
The new Search Analytics is really a major step forward, in my opinion, but it doesn't seem to have been tested very thoroughly. For instance, on any screen that shows total clicks, if you click the Download button, export that page to Excel, and then sum the clicks column, the total is off by about 30%. So you may be looking at the effect of one of a number of basic bugs here.
Online Marketing Tools | MichaelC-15022
XML Sitemaps for Property Website
Hi there! I would also add that you can look into a sitemap generator, which will update and maintain your sitemap for you so that you don't have to do it manually. Manual upkeep can be quite tedious, especially if you are adding and removing multiple listings, so a generator can be a huge time saver. Hope this helps a bit - good luck!
On-Page / Site Optimization | PatrickDelehanty
Can anyone see any issues with the canonical tags on this web site?
If the canonical link is not set, it's possible you will get additional pages indexed for the same content, especially if you are using a CMS. The best course of action would be to do a site: search for your domain and see how many pages are indexed. If you know you have 200 pages on your site and you have 2,000 indexed, you most likely have some duplicates showing up. We wrote an article about avoiding duplicates through .htaccess earlier this month; it might be worth a read if you have not set any of those rules for your site. CMSs (WordPress, Joomla) can be tricky when it comes to getting duplicate pages under control. *Edit: Looking at your indexed pages, you are showing 122,000 URLs indexed for your site. If that isn't the case, you should look into assigning canonical links for all your posts and, if the duplicates are already indexed, setting up redirects. You can also use Google's URL parameter tool to help, although that tool should be used with caution if you don't know how to use it.
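As a sketch, the fix is a self-referencing canonical tag in the head of each page, pointing every URL variant at the one version you want indexed (the domain and product path below are made-up examples):

```html
<!-- Served identically on /products/widget, /products/widget?sort=price, etc. -->
<head>
  <link rel="canonical" href="https://www.example.com/products/widget" />
</head>
```

To check the indexed count mentioned above, run a `site:yourdomain.com` query in Google and compare the result count to the number of pages you know you have.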
Intermediate & Advanced SEO | David-Kley
I need an XML sitemap expert for 5 minutes!
A few rules about sitemaps: only include pages you also want crawled and indexed, and never include URLs that 404 or are blocked by robots.txt. My guess is there are too many URLs in the sitemaps, since I doubt the website actually has over 2 million "real" pages. Also, I randomly clicked a URL in one of the sitemaps and it 404'd: http://www.eumom.ie/forums/topic/oakhill-school-leopardstown-/ This is probably causing a lot of the errors you see. It's honestly not a 5-minute fix, but if it were my site, I would be using the Yoast SEO plugin and its sitemap feature. It makes it very easy to include/exclude certain pages, and it updates automatically. There must be a way to tell your current plugin what to include or exclude from the sitemap, but I don't have as much experience with it. Generally: only include pages you want crawled and indexed, and don't include pages that 404.
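For reference, a valid sitemap is just url entries inside a urlset, and every loc should return a 200 and be crawlable. A minimal sketch (the forum URL below is a made-up example on the same site, not taken from the actual sitemap):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only list URLs that return 200 and are not blocked by robots.txt -->
  <url>
    <loc>http://www.eumom.ie/forums/</loc>
  </url>
</urlset>
```

Anything that 404s, redirects, or is disallowed in robots.txt should be removed from the file rather than left for Google to flag as an error.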
Technical SEO Issues | evolvingSEO