A developer who tells you "W3C validation isn't important" is like a house builder telling you "Those small cracks in the walls are nothing to worry about"
George
Google has a policy for this - what you're doing is not advisable - you should be annotating the URLs. You can read the correct approach to take here: https://developers.google.com/webmasters/mobile-sites/mobile-seo/configurations/separate-urls
Hi,
You're far from being alone with the issues you described, but personally I wouldn't recommend what you're suggesting:
If I were you I'd disavow the spam links per Google's policy (https://support.google.com/webmasters/answer/2648487?hl=en), set up 301s to your new URLs and, with a bit of patience, start your SEO afresh with a clean slate.
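If the old site runs on Apache, the 301s can be set up in one sweep. This is only a sketch: the domain name and the assumption of mod_rewrite are mine, not from the question.

```apache
# Hypothetical .htaccess on the old domain (assumes Apache with mod_rewrite enabled)
RewriteEngine On
# Permanently (301) redirect every old URL to the same path on the new domain
RewriteRule ^(.*)$ https://www.newdomain.example/$1 [R=301,L]
```

If your URL structure changes as part of the move, you'd map old paths to their new equivalents individually rather than relying on a single catch-all rule.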
George
@methodicalweb
Hi,
I see a couple of assumptions in your question. First, I would say that having a "keyword rich domain" is becoming a less significant ranking factor in SERPs, so I wouldn't base the migration of an existing website that performs pretty well on the potential of a new domain targeting certain keywords.
The second assumption is that your existing domain is ranking purely because it's older. There are likely to be other factors at play here, particularly backlinks.
However, I realise that you need to restructure the website, and moving to a single domain with the complexes on subdirectories makes sense architecturally. You might well see a drop in rankings in the meantime while you do this migration, so if this is a key acquisition channel, investigate PPC options to bolster your traffic.
As for the 301 - I agree it makes sense to 301 to the complex subdirectory for a user, however in Webmaster Tools Google doesn't support the migration of one domain to the subdirectory of another domain. This means it won't be as seamless as if you migrate to the root of the new domain.
One way around this would be to redirect the old domain to the root domain, but provide very clear navigation on how to get to the relevant apartment complex to a user. As far as a user is concerned, I would see this as an acceptable solution.
George
Your site appears to be indexed OK, but your visibility is low. I checked, and "money site" is a low-competition keyword that you should be ranking better for.
Taking a look at your backlink profile (opensiteexplorer.org), it appears that there are a ton of toxic links pointing to the domain. This is almost certainly going to affect your rankings through Google Penguin, unless someone's already gone through a stringent disavow process.
Before you launched a new site on this domain, was it vetted to see if your predecessors had done any link building badness?
George
Hi,
I've been badly burnt by agencies in the past offering "quality" link building services and have done quite a lot of work on dealing with a conundrum similar to yours. Here is my advice:
Good luck,
George
Personally I wouldn't rely just on robots.txt, as one accidental, public link to any of the pages (easier than you may think!) can result in Google indexing that subdomain page (it just won't be crawled). This means the page can get "stuck" in Google's index, and to resolve it you would need to remove it using WMT (instructions here). If a lot of pages were accidentally indexed, you would need to remove the robots.txt restriction so Google can crawl them, and add a noindex tag to each page so Google drops it from its index.
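To illustrate the difference between the two mechanisms (the directives below are generic, not taken from your site): robots.txt blocks crawling but not indexing, whereas a robots meta tag controls indexing but only works if the page can be crawled.

```
# robots.txt at the root of the staging subdomain -- blocks all crawling
User-agent: *
Disallow: /
```

The per-page alternative is `<meta name="robots" content="noindex">` in the page's `<head>`, which tells Google to drop the page from its index. Google has to be able to crawl the page to see that tag, which is why the robots.txt block has to come off first.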
To cut a long story short, I would do both Steps 1 and 2 outlined by Federico if you want to sleep easy at night :).
George
Hi Rich,
I can't imagine that Google would penalise your parent site because branch site domains 301 to it in this way. I gather from Matt Cutts that a significantly long chain of 301s (301->301->301->301 etc) might be frowned upon but that's not what you're proposing.
I'm making an assumption that your goal is to publicise the "friendly" .com domains on business cards / advertising so that users don't have to type in a long URL. You will most likely get links, and the 301 will pass on at least some of the value to the parent site. The only thing to consider perhaps is whether you plan to have other pages on the friendly domains which also need to 301 (e.g. www.DentalCareofLacey.com/contact), in which case there will be an overhead in maintaining these.
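As a sketch of what maintaining those redirects might look like on Apache (the paths and domains here are illustrative, based on my assumption about your setup):

```apache
# Hypothetical .htaccess on the friendly domain
RewriteEngine On
# Map known subpages to their counterparts on the parent site
RewriteRule ^contact/?$ https://www.parentsite.example/lacey/contact [R=301,L]
# Everything else goes to the regional landing page
RewriteRule ^(.*)$ https://www.parentsite.example/lacey/ [R=301,L]
```

Every new page added to a friendly domain means another rule to keep in sync, which is the maintenance overhead I mentioned.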
You'll also want the parent site landing pages to be SEO optimised for their respective regional areas, but as you've already got the region in the URL I think you're probably on top of this.
George
Hi Tanveer,
It's hard to answer your questions without seeing the raw data. I presume these are external rather than internal links, and that they are genuinely new as opposed to only just having been discovered. I would start with going to Webmaster Tools, downloading your latest links and having a look at where they are coming from.
There could be a number of reasons for this, and so there's no point me speculating and you're right to investigate further. Using a link profile checker such as cognitiveseo.com will give you a clearer idea on the quality of any new links you acquire.
Feel free to post more information if you need,
Regards
George
You're in luck because Matt Cutts covered at least part of this question quite recently, which you can read about/watch here: http://searchenginewatch.com/article/2308339/Matt-Cutts-Create-Unique-Meta-Descriptions-for-Your-Most-Important-Pages.
In short - you should hand-craft the meta descriptions for your most valuable pages (i.e. the pages you want to rank high in SERPs), but hand-crafting every meta description on your site isn't expected, given the amount of work involved.
Personally I think the variance between these auto generated descriptions is still too low and would look for other words to vary them by - for example the type of cruise, the savings, or activities offered on the cruise in each region.
You'll also want to bear in mind a similar problem you're likely to experience with the page Title, Headings and content.
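If you want a rough way to measure how similar your auto-generated descriptions are, a few lines of Python will flag near-duplicates. The sample descriptions and the threshold below are invented for illustration, not taken from your site:

```python
# Flag pairs of meta descriptions that are suspiciously similar.
# Threshold and sample descriptions are illustrative assumptions.
from difflib import SequenceMatcher

def near_duplicates(descriptions, threshold=0.8):
    """Return index pairs of descriptions whose similarity exceeds the threshold."""
    pairs = []
    for i in range(len(descriptions)):
        for j in range(i + 1, len(descriptions)):
            if SequenceMatcher(None, descriptions[i], descriptions[j]).ratio() > threshold:
                pairs.append((i, j))
    return pairs

descriptions = [
    "Save on Caribbean cruises from Miami. Book your cabin today.",
    "Save on Alaskan cruises from Miami. Book your cabin today.",
    "Family activities, spa days and fine dining on our Mediterranean cruises.",
]
print(near_duplicates(descriptions))  # → [(0, 1)]
```

Anything this flags is a candidate for the kind of rewording suggested above.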
George
It looks like this error is caused by a plugin you have installed and enabled on your WordPress site that probably isn't compatible with the version of WordPress you're running. If you disable the Backlinker plugin, it will probably go away.
As for SEO impact - it appears to also have mangled your /robots.txt (which you should fix), and the user experience of seeing this error is poor and so it's worth fixing.
George
Hi Lee,
The foundation site idea sounds like a real roundabout way of achieving organic traffic and hence sales - which from a high level I'm assuming is what you're trying to achieve. It would perhaps make more sense if you were going to use the Foundation site to drive referrals, or to use for PR, rather than solely for link equity purposes.
It wouldn't take much for Google to work out that the foundation site is a bit of a cynical attempt to gain rankings.
If I was you I'd focus on improving the content and linkability of your client's existing site and address some of the branding issues head on rather than side-stepping them with a sister website. You can incorporate the "foundation" idea into the existing website (perhaps on a subdomain or directory), which if done properly - with valuable content - will earn natural links and therefore gain far more organic value than having a sister website.
George
Hi,
I understand the question as: In the SSL (HTTPS) version of my homepage, should I add a rel=canonical link to the markup which points to the non SSL version of my homepage?
If your SSL pages are only accessible to authenticated users (i.e. not crawlers) then I can't see that it would make much difference as you won't suffer from duplicate content. However, if your SSL page is accessible to crawlers (as is becoming more common recently) then adding the canonical tag to non SSL is a good idea. In addition to preventing duplicate content issues, there's a good chance that your SSL page might get linked to, and blocking crawlers to it (using noindex / robots etc) means you won't get the benefits of those links.
One thing to bear in mind first is that you should decide on whether the single canonical version for your site is your HTTP or HTTPS pages. Then canonicalise accordingly.
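As a minimal sketch, assuming you settle on HTTP as the canonical version (the domain is a placeholder), the tag in the `<head>` of the HTTPS homepage would look like:

```html
<!-- In the <head> of the HTTPS copy of the homepage; domain is a placeholder -->
<link rel="canonical" href="http://www.example.com/">
```

If you choose HTTPS as canonical instead, reverse the direction and point the HTTP pages at the HTTPS URL.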
George
Hi,
This is quite hard to diagnose without seeing the actual page content but I can give you some pointers:
1. It sounds like the textual variance between the core gallery page and the individual category pages is too low. If you want to rank for the individual category gallery pages then consider writing a different paragraph on each page (100-200 words) to vary them as well as varying the title, description, headings and anything else that you can. Google will then index all of the pages and they won't have duplicate content.
2. If you don't need to rank for the category pages, and want to keep the content the same (apart from the images), then consider using a rel=canonical from the category pages back to the core gallery page. Google will then only index the core gallery page and you don't need to worry about the content being duplicated. Moz should honour the use of rel=canonical and not report duplicate content any more.
George
@methodicalweb
Hi,
It was a very bold move to drop such a significant number of pages from your site, especially if they were built up over time and attracted links. Even if the content wasn't completely original, that's not to say it didn't have some value. I think if I had made such a major change to a website and saw rankings drop, I would probably have reversed the change but then it's not clear whether that's an available option. Since I don't know the full reasoning behind the decision I'll reserve any further judgement and try to answer your question.
Returning 404s is the "right" thing to do as those pages don't exist any more, though putting 301s to very similar content is preferable to keep the benefit of any backlinks. I sense there weren't many links to worry about though as you're not very positive about the content which was deleted!
Google will hold onto pages which return 404s for some time before removing them from its index. This is to be expected as web pages can break/disappear unintentionally and so you have a grace period to "fix" any issues before losing your traffic.
The fact that Moz isn't showing any 404s shows that you aren't internally linking to the deleted pages, so they aren't being picked up by the crawl. They will drop out of WMT in a few weeks where you haven't set up 301s to existing pages. You should also double check that they've been removed from the sitemap you submitted to Google.
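One way to sanity-check this, sketched in Python with invented URLs, is to diff your list of deleted pages against the redirect map you've configured:

```python
# Sketch: find deleted URLs that will keep 404ing because no 301 is configured.
# All URLs here are invented examples.

redirect_map = {
    "/old-article-1": "/guides/article-1",
    "/old-article-2": "/guides/article-2",
}

deleted_urls = ["/old-article-1", "/old-article-2", "/old-article-3"]

def missing_redirects(deleted, redirects):
    """Return deleted URLs that have no 301 target configured."""
    return [url for url in deleted if url not in redirects]

print(missing_redirects(deleted_urls, redirect_map))  # → ['/old-article-3']
```

Any URL this reports is one where you're choosing to return a 404 and forfeit whatever links pointed at it.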
Hope that helps,
George
@methodicalweb
The fact you already know the toxic links and have link building experience suggests you're more than equipped to do it yourself. As already suggested, the best way is to use Google's Disavow Links tool in Webmaster Tools, and this is exactly what a link removal company would do.
I would consider a month's trial/paid subscription to Open Site Explorer / Majestic to get all the links, though Webmaster Tools also provides a free sample download of links.
I'm sure it goes without saying but just make doubly sure that these links are toxic before you disavow them. If a link removal company slipped up and did this then there's a risk of them causing harm so there are advantages to you doing it yourself if you know what you're doing.
Personally I would include the company/brand name. If it isn't a blue chip then it does no harm to re-enforce your brand on SERPs, and if it is a blue chip then you potentially stand to increase click-through because of increased trust/recognition.
George
Hi Justin,
Personally I think you'll be fine as you've described the initiative working. Google doesn't expect every link to a website to be from a high authority, otherwise it would look unnatural. In reality, there will be a mix of high and low authority pages/domains that link to every website. However, if the blog posts are being spun out on blog networks, or if your customers sites have been penalised by Google then it probably isn't going to help you much.
What isn't clear is whether you're effectively buying these links, and whether they will pass PageRank or not. I encourage you to read Google's guidelines on this: https://support.google.com/webmasters/answer/66356?hl=en.
George
Hi Jarrett,
Although the menus probably look different in your designs (an assumption on my part), the HTML looks identical on the link you provided (ULs/LIs). If the HTML is the same, then you'll use CSS to vary the appearance of them, specifically using media queries and the viewport on responsive mobile, which are designed for exactly this scenario.
Perhaps I'm missing some other dev reason why it can't be done, but using AJAX for this, even if you do attempt to block Google from crawling it, sounds like an over-engineered solution.
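A minimal sketch of the CSS approach (the selectors and breakpoint are assumptions, since I haven't seen your markup):

```css
/* Same HTML menu, different presentation per device width */
nav ul li {
  display: inline-block;    /* desktop: horizontal menu */
}

@media (max-width: 600px) { /* breakpoint is illustrative */
  nav ul li {
    display: block;         /* mobile: stacked menu */
  }
}
```

Combined with `<meta name="viewport" content="width=device-width, initial-scale=1">`, this serves one crawlable page to Google while varying the presentation per device.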
George
Hi Finnmoto,
You're in luck - it does use a 301 for that homepage redirect. The results of this test were brought to you by the mighty Fiddler (http://www.telerik.com/fiddler).
I've migrated pages like this before and it can take a bit of time for the dust to settle. Remember you've migrated an entire website to a new subdomain in one go and that takes time for Google and other services to process (depending on how authoritative your site is).
It's worth crawling your entire site's old URL structure with ScreamingFrog to check the redirects were implemented correctly.
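While you're at it, check for redirect chains, since a 301 pointing at another 301 leaks value. A quick sketch in Python, with made-up URLs standing in for your crawl export:

```python
# Sketch: count redirect hops per URL to spot 301 chains.
# The mapping below is hypothetical; in practice you'd load it from a crawl export.

redirects = {
    "http://example.com/a": "http://example.com/b",       # first hop of a chain
    "http://example.com/b": "http://blog.example.com/b",  # second hop
    "http://example.com/c": "http://blog.example.com/c",  # clean single hop
}

def chain_length(url, redirects, limit=10):
    """Follow redirects from url, counting hops (capped to guard against loops)."""
    hops = 0
    while url in redirects and hops < limit:
        url = redirects[url]
        hops += 1
    return hops

for url in redirects:
    print(url, "->", chain_length(url, redirects), "hop(s)")
```

Anything reporting more than one hop is worth collapsing into a direct 301.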
Regards,
George