Regarding pagination - the URLs look fine, and you should use rel=prev/rel=next instead of the noindex tag.
Regarding sorting - Google has a handy little sheet, which you may or may not have seen, that covers this kind of issue.
I was afraid that this might be the case.
Thanks for the help.
1. They cheat using automated methods of link building.
2. It's a loophole in Google's algorithm.
3. If it's as bad as you say, then I would imagine they will be penalized for it. However, you can always give them a little help by filing a spam report.
I keep getting the "Googlebot found an extremely high number of URLs on your site" message in the GWMT for one of the sites that I manage.
The error is as below:
Googlebot encountered problems while crawling your site.
Googlebot encountered extremely large numbers of links on your site. This may indicate a problem with your site's URL structure. Googlebot may unnecessarily be crawling a large number of distinct URLs that point to identical or similar content, or crawling parts of your site that are not intended to be crawled by Googlebot. As a result Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all of the content on your site.
I understand the nature of the message - the site uses faceted navigation and is genuinely generating a lot of duplicate pages. However, in order to stop this from becoming an issue, we do the following:
But we still get the error, and a lot of the example pages that Google suggests are affected by the issue are actually pages with the noindex tag.
So my question is how do I address this problem?
I'm thinking that, as it's a crawling issue, the solution might involve the nofollow meta tag.
Any suggestions appreciated.
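For reference, the two meta tags being discussed look like this (a generic illustration placed in a page's head section, not your actual markup):

```html
<!-- Keeps the page out of the index, but still lets crawlers follow its links -->
<meta name="robots" content="noindex, follow">

<!-- Tells crawlers not to follow any links on the page -->
<meta name="robots" content="nofollow">
```

Note that noindex only stops a page being indexed - Googlebot still has to crawl the page to see the tag, which is why noindexed pages can still show up in crawl reports.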
First off I'd check that it's not an issue with your rank tracking software.
Beyond that I have seen some crazy fluctuations on Bing and Yahoo but I haven't dug into them yet - I'm based in the UK so Google's got a 90%+ market share.
Nope.
A 301 is something you have to actively implement.
You can change your title tags without it causing any technical issues UNLESS your CMS automatically rewrites URLs based on the title of the page. If this is the case, then you'll need to find a way to disable this in order to change your title tags.
I agree with Ryan.
No. It's not going to hurt you.
Unless there are other factors limiting you I would probably bring all of the variables onto a single page and treat size as a variable similar to the way you treat colours.
You don't currently have separate pages for Burgundy and Caribbean Blue covers, so why not do the same for size? You would want to adjust the copy to indicate that the product is available in different sizes, and obviously this would require reworking the page, but if you're going to create a parent page anyway then you might as well.
Hope that helps.
Glad to hear it's all working for you.
Do you think the directories had a noticeable impact? I've discontinued my BOTW subscription because I didn't see the value in it anymore, but I'd be interested in hearing about your experience.
It's more of a copy issue as you mentioned. So long as the content on the pages is unique you shouldn't have a problem.
There is an argument that linking out might increase your rankings by a tiny bit.
So long as it's a decent site - not hosting malware, not spammy - it won't have an adverse effect.
With regard to follow/nofollow - if you trust the site and think it's passing a benefit to users, then why put the nofollow on it?
I've got a couple of recommendations.
First off, you can demote a sitelink in Google Webmaster Tools. You can find the option under Configuration > Sitelinks. Google is obviously assuming that the pages with current sitelinks are important somehow, so it might be a good time to recheck your site architecture.
Secondly, you should probably remove the nofollow attribute from internal links on your homepage. It won't be having a beneficial effect and could negatively impact how search engines view certain pages on your site.
Hope that helps.
Really?
I'd have gone for www.domainname.com/category/subcategory
Have you done any testing on this? Would be really interested in hearing your results.
You can use pagination markup that lets Google know it's one long list.
There's a whole video, from Google, about it here - http://googlewebmastercentral.blogspot.co.uk/2012/03/video-about-pagination-with-relnext-and.html
It's worth watching.
I don't know which other search engines support this markup.
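For reference, the markup goes in the head of each paginated page and looks something like this (example.com and the page parameter are placeholders - adapt them to your own URL structure):

```html
<!-- On page 2 of a paginated series -->
<link rel="prev" href="http://www.example.com/category?page=1">
<link rel="next" href="http://www.example.com/category?page=3">
```

The first page in the series only gets a rel="next" tag, and the last page only gets a rel="prev" tag.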
If you're after redirecting mysite.com/page?id=xyz to mysite.com/page, then you can use RegEx in your .htaccess file to automatically do it for any variation of ID and TYPE.
I'm not particularly technical so that's all I can offer but I'd love to see what other responses you get back.
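As a rough sketch of the .htaccess approach (assuming an Apache server with mod_rewrite enabled, that the parameters are literally called id and type, and that the path is /page - adjust all of these to your actual setup):

```apache
RewriteEngine On
# Match requests for /page that carry an id or type query parameter
RewriteCond %{QUERY_STRING} (^|&)(id|type)= [NC]
# 301 to the clean URL; the trailing ? drops the query string
RewriteRule ^page$ /page? [R=301,L]
```

Worth testing on a staging copy first, as rewrite rules are easy to get subtly wrong.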
You won't reduce your relevancy however you will have to do some work with your site architecture.
At the moment your top nav is pretty extensive, and adding extra links to new product lines might damage the user experience. So you'll need to remove some of your current links or overhaul your design in order to add new categories.
Continuing on the same domain is definitely the right thing to do however as your new content will benefit from the link equity you have already built up.
I don't have numbers as a % of the total market, but after having a quick look at a large sample on an e-commerce site I found around 1 wildcard per 70,000 searches. Of those, only 1 in 20 actually looked like they were deliberately using it as a wildcard.
Hi Steve,
I think it's possible for you to do both. Conduct your research as if you want to rank for both [IT support] and [Denver IT support] and group those terms onto the same page.
Then optimize the page to put emphasis on the generic term. So in this example your title tag would be something like "IT Support in Denver - Steve Sequenzia". That way you're hitting both possible avenues.
My only concern with this would be CTR from the SERPs and whether people who type in [IT support] really want a local firm or a local branch of a national firm. I'd probably run a couple of short tests using geo-targeted PPC to see which types of ads get the most interest.
Hope that helps.
What Gamer07 said.
PPC doesn't influence organic rankings but that doesn't mean you shouldn't bid on a competitor's brand name.
I don't think it will "weaken" the domain, but it might provide a better experience for users if, instead of clicking a link and being 301'd, they could click straight through to the target page.
You can 301 the duplicate pages as well if you like.