Don't forget to exclude pages that don't contain the information you are looking for - exclude query parameters which just result in duplicate content, system files, etc. That may help to bring the amount down.
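For example, Screaming Frog's Exclude feature takes regular expressions; patterns along these lines (hypothetical examples - adapt them to your own URL structure) would skip common sources of duplicate and junk URLs:

```text
.*\?sort=.*
.*\?sessionid=.*
.*/cgi-bin/.*
```

Excluding these before the crawl starts keeps the crawl size (and memory use) down on a site with millions of pages.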
Posts made by McCannSEO
-
RE: What is the best tool to crawl a site with millions of pages?
-
RE: Anyone else noticed that Google Webmaster Tools data has not updated since Sep 24th - when 100% not provided was announced?
Thanks to everyone responding and letting me know that I am not alone!
Special thanks to Highland for spotting that post in the Product Forums and letting us know that Google haven't taken ALL the keyword data away. It looks like it is slowly backfilling the data - I can now see up to the 25th, so it's still not fully up to date, but getting there.
Revolution postponed!

-
Anyone else noticed that Google Webmaster Tools data has not updated since Sep 24th - when 100% not provided was announced?
It's normal not to be able to see Google Webmaster Tools data (search queries, CTR, etc) for the past 2 days while it updates. However, I've given it time and I am not seeing any data pulling through since last Tuesday - coincidentally (?) when the (not provided) announcements were made. Is anyone else seeing this?
I have looked in different GWT accounts, and in both asynchronous and Universal Google Analytics accounts too, but I'm seeing the same thing... Surely Google hasn't taken search queries away from us? I find that hard to believe, since they just gave us paid and organic reports in AdWords.
There's no info on the GWT blog to announce this...maybe it will update later, just think it's being a little slow if so!
-
RE: Best practice for multi-language site?
Yes, that's a good point. So if you are just translating content but not targeting it at specific countries, you can use hreflang to specify the language without specifying the country. E.g. hreflang="fr-ca" would specify French Canadian content, but hreflang="fr" would just state that it is for all French speakers.
In this case, you wouldn't need different top level domains to target each country, which is probably more than what you need!
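To make that concrete, here is roughly what the two annotations would look like as link tags (example.com and the paths are placeholders):

```html
<!-- French content targeted at French speakers in Canada -->
<link rel="alternate" hreflang="fr-ca" href="http://example.com/fr-ca/" />
<!-- French content for all French speakers, with no country targeting -->
<link rel="alternate" hreflang="fr" href="http://example.com/fr/" />
```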
Hope that helps.
-
RE: Transferring newly created targeted landing pages on an existing domain to a new domain
Just a thought - would a subdomain be a good idea here? So it would look something like newbrand.testing.com as a subdomain of www.testing.com. This seems to be an alternative if hosting allows it.
The blog would still be best placed to sit on the main domain, but any thoughts on this would be appreciated.
Thanks (sorry if I am just answering my own question here)
-
RE: Transferring newly created targeted landing pages on an existing domain to a new domain
Hi Jesse
Thanks for getting back to me.
The client is launching a new product/service that is different from their existing products. They still want to keep the old site active, but the new products and blog are what is being re-branded and marketed, not the entire site.
Their current site is hosted and built by a different agency, and the new product is being marketed by us. We are waiting to see if we can install a blog on their existing domain, as it is not hosted by us. The client has not requested a new site, more asked about one.
Originally, before I knew all the history, I suggested landing pages on the existing domain for the short term, and then, if they still wanted to go ahead with the re-branding and new site, 301ing the whole domain. A question on this though: if we don't really need to do a redirect (i.e. the company name isn't changing, it's just a re-brand), I think it would be a good idea to leave the domain as it is, as it has authority and history. What are your thoughts on this?
I wanted to get feedback to see if anyone else has ever re-directed part of a domain, rather than a whole site as I didn't think it would be a good idea either.
Thanks again for the response
Tracey
-
Transferring newly created targeted landing pages on an existing domain to a new domain
Hi - Hope someone can help me with this please
I have a question regarding whether it's possible or advisable to create and host targeted landing pages and a blog on an existing domain, and then move only these pages to a brand new domain.
The existing site has good authority and is established. Due to tight timescales in delivery I suggested creating specific landing pages and installing a blog to build authority and trust over time to target completely new keywords. Also the new pages will be helped by the existing domain authority.
I've just found out the client may want a whole new site in time, complete with new branding etc. and a completely new domain.
Has anyone experienced migrating specific pages and a blog across to a completely new domain, leaving the existing site as it was? I have a whole host of concerns over this, but the main one is that I will be building relationships and content for the landing pages and the blog, as well as linking out etc., and then these URLs will have a redirect on them, going to a completely new domain.
Also, the existing domain could lose any authority gained: although I won't only be targeting these pages, they will be the main ones being optimised, and this could look unnatural.

So do I:

A) Create the blog and new landing pages on the existing domain (e.g. www.testing.com/blog and www.testing.com/new-landing-pages), and then migrate these across to a brand new domain? Or

B) Create the new landing pages and blog and leave them on the existing domain, period? Concerns here: the client wants to revamp and have a new style, these pages will not necessarily be supported by the existing site, and there is no guarantee that we are even allowed to create new pages, let alone do internal linking. Or

C) Bite the bullet, simply suggest a brand new domain to start with, and explain the timescales: it's either a completely new domain or working on the existing one?
If anybody else has any other ideas, I would really appreciate them. The client is re-branding, and the agency who host the existing domain might not want to support the new pages and blog.
I was hoping to provide a short-term and long-term solution, as a brand new domain will take time to build up, especially as we are also targeting brand new keywords. However, I don't want the existing domain to be hit with any penalties or to flag anything unnatural to Google.
Many thanks in advance for any advice..
- Tracey
-
RE: Best practice for multi-language site?
I'm not sure of the definitive answer to your question re. subfolders / subdirectories, but have you considered using ccTLDs? This is still the clearest way to tell Google what country you are targeting, although obviously there are logistical points to consider.
See what everyone else says but there are some great articles here:
http://www.seerinteractive.com/blog/international-seo-strategy-guide
http://www.seomoz.org/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday
http://www.seomoz.org/blog/international-seo-dropping-the-information-dust
Re. hreflang: yes, I think you should implement it if you are keeping everything on the same domain, but you don't have to do it on a page-by-page basis - you can use a sitemap instead. More info and a free tool to generate them here:
http://www.themediaflow.com/2012/08/an-international-seo-implementation-tale-sitemaps-relalternate-hreflangx/
http://www.themediaflow.com/resources/tools/href-lang-tool/
Hope that helps!
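For reference, a sitemap-based implementation carries the same hreflang annotations per URL; a minimal sketch (all URLs are placeholders) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://example.com/fr/</loc>
    <!-- each URL lists itself and all its language alternates -->
    <xhtml:link rel="alternate" hreflang="fr" href="http://example.com/fr/" />
    <xhtml:link rel="alternate" hreflang="en" href="http://example.com/en/" />
  </url>
</urlset>
```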
-
RE: Should I fix a high quality link when the linking website was complaining? What would you do?
That is a very good point, thanks for making it.
As it turns out, after having done a bit of digging, it seems that they have a few other links from this website which are not as negative, relating to a competition they ran. However, I still think it's worth preserving the link and addressing the concerns. Links from high-DA sites don't grow on trees, and surely it's worth doing a bit of reputation management?
I'll speak to the client and see what they want to do.
Thanks both!
-
RE: Should I fix a high quality link when the linking website was complaining? What would you do?
Thanks for the reply Dennis.
I can't really let it slip through my fingers. Awkward conversation with the client coming up!
If for any reason, they won't go for the idea of addressing the issue up front, do you think it would be harmful to 301 redirect the old link to the home page?
-
RE: Google: How to See URLs Blocked by Robots?
Okay - the robots.txt will only exclude robots from the folders and URLs specified, and as I say, there's no way to download from Webmaster Tools a list of all the URLs that Google is not indexing.
If you have exact URLs in mind which you think might be getting excluded, you can test individual URLs in Google Webmaster Tools in:
Health > Blocked URLs
Specify the URLs and user-agents to test against.
Beyond this, if you want to know if there are URLs that shouldn't be excluded in the folders you have specified, I would run a crawl of your website using SEOMoz' crawl test or Screaming Frog. Then sort the URLs alphabetically and make sure that all of the URLs in the folders you have excluded via robots.txt are ones that you want to exclude.
-
RE: Google: How to See URLs Blocked by Robots?
Hi Larry
Just for my understanding, why do you want to find those URLs? Are you concerned that the robots.txt is blocking URLs it shouldn't be?
As for downloading a list of URLs which aren't indexed from Google Webmaster Tools, which is what I think you would really like, this isn't possible at the moment.
-
RE: Google: How to See URLs Blocked by Robots?
If you want to see if Google has indexed individual pages which are supposed to be excluded, you can check the URLs in your robots.txt using the site: command.
E.g. type the following into Google:
site:http://www.audiobooksonline.com/swish.cgi
site:http://www.audiobooksonline.com/reviews/review.php/new/
...continue for all the URLs in your robots.txt.
Just from searching on the last example above (site:http://www.audiobooksonline.com/reviews/review.php/new/), I can see that you have results indexed. This is probably because you added the robots.txt after the pages were already indexed.
To get rid of these results, you need to: take the culprit line out of the robots.txt; add a robots meta tag set to noindex to all the pages you want removed; submit a URL removal request via Webmaster Tools; check the pages have been noindexed; and then add the line back into the robots.txt.
This is the tag: <meta name="robots" content="noindex">
I hope that makes sense and is useful!
-
Should I fix a high quality link when the linking website was complaining? What would you do?
While reviewing 404 errors in Webmaster Tools, I noticed that a client had a link from a high authority, well respected forum, to a page which no longer exists.
When I checked out the linking post, it was from 2004 and showed a campaign against the company for its advertising tactics. I'll spare the details, but the company has since changed their ways.
It's tempting to implement a 301 to get the link juice from this DA 80 post, but since the reason for the link is a negative one and the co-citations are not going to be positive, is it better to just let this link go?
Or what about something more up-front, such as setting up a page which states the company's mission statement and commitment to quality and standards and 301 redirecting to there? Even if we let this link be broken, a potential customer could be put off, so it might be a good idea to address this past issue on site?
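If we did go the up-front route, the mechanics are simple; assuming the site runs on Apache, the redirect would be a one-liner like this (both paths are hypothetical):

```apache
# Permanently (301) redirect the dead URL to a dedicated standards/mission page
Redirect 301 /old-campaign-page /about/our-commitment-to-standards
```

A 301 passes the authority on while landing visitors somewhere that addresses the old complaint head-on.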
Let me know your opinions on whether there is a way to benefit from this link or whether we are better off allowing the 404.
-
RE: Link + noindex vs canonical--which is better?
Can I ask a question that leads on from this: how attractive a proposition is syndicated content to publishers if you ask them to add a noindex / cross-domain canonical as well as a link back to your article? Surely they want a chance to rank, especially if they are planning on adding their own take and UGC to differentiate it where possible, as Rand advises here: http://www.seomoz.org/blog/whiteboard-friday-leveraging-syndicated-content-effectively
Personally, content syndication is not something I would ever recommend for a client, as the complications from duplicate content outweigh the benefits from the links that could be earned - it just makes more work when that time could be spent on high-quality guest blogging (in my view).
However, a new client is really interested in doing it. But if we offer content on those terms (link + noindex / cross-domain canonical), will there be any interest in using the syndicated articles at all?!
Maybe it would be better to offer the content in return for a link and a guarantee that they will either add unique content to it or canonicalize / noindex?
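For reference, the cross-domain canonical the publisher would need is just a single tag in the head of the syndicated copy, pointing back at the original (URL is a placeholder):

```html
<link rel="canonical" href="http://www.example.com/original-article/" />
```

It's a small ask technically, but as above, it does mean the publisher gives up their own chance to rank for that piece.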