Questions
-
Webmaster Tools "Not found" errors after sitemap update
Hi Luke, Generally I wouldn't worry too much about that - the main thing is to stop users landing on these 404 pages, which it sounds like you're doing. For reference, here is a helpful article from Google regarding their stance on 404s. Ideally, where there is a relevant page you can 301 redirect to, do that - but it sounds like this may not be possible, as the URLs Google has now crawled never existed in the first place. So I'd recommend a 301 redirect to another relevant page where you can (but don't mass-redirect them all to a single page or your home page), and where there is no relevant page, leave them as 404s - it is unlikely they will hurt your rankings. Cheers. Paddy
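As an aside, if the site runs on Apache, per-URL 301s like Paddy describes can be set up in .htaccess - a minimal sketch, with hypothetical paths:

```apache
# 301 redirect a removed page to its closest relevant replacement
# (paths here are hypothetical examples)
Redirect 301 /old-guide /guides/current-guide

# Or, with mod_rewrite, redirect a whole pattern of URLs:
RewriteEngine On
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]
```

The key point from the answer above holds either way: each redirect should point at a genuinely relevant page, not a catch-all.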
Intermediate & Advanced SEO | | Paddy_Moogan0 -
Duplicate keyphrases in page titles = penalty?
That's an interesting thought Luke. Yes, I agree something like that would work much better. I think a group like that would need some strong affiliations with already recognised online groups of like-minded SEO people (like on Moz) to give it gravity and value, but it could work. I don't know if such a group exists. Peter
Intermediate & Advanced SEO | | crackingmedia0 -
301 forwarding old urls to new urls - when should you update sitemap?
Hi Luke, To bring the suggestion on searchenginewatch.com into this conversation, it said: "Submit an updated sitemap to Google Webmaster Tools and use the change of address function if moving to a new domain. Remember to initially keep the old URLs in your XML sitemap to facilitate Google crawling those links and processing the changes in their index."

It would be interesting to hear others' feedback on that. Personally, keeping old URLs in a sitemap (URLs that, without a redirect, would return a 404 page-not-found error) doesn't seem correct to me. Presumably the old URL was in the sitemap while the page at that URL was live. But by setting up a 301 redirect, you are telling Google that the page it has in its index has now permanently moved to a new URL. A sitemap is the list of URLs on your site that you are asking Google to crawl, so including the old URL alongside the new one is essentially asking Google to crawl two URLs pointing at the same page. I'm not sure Google would treat that as a canonical issue (because the old URL is no longer current), but for me it's a misuse of the sitemap. As I say, though, it would be interesting to hear others' feedback on this. Peter
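To illustrate Peter's position: once the 301s are live, the sitemap would list only the new, canonical URLs - a minimal sketch, with hypothetical URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only the new URL appears; the old URL is handled by its 301 redirect -->
  <url>
    <loc>https://www.example.com/new-page/</loc>
  </url>
</urlset>
```

Google will still discover the old URLs (and follow the redirects) via its existing index and inbound links, even if they are dropped from the sitemap.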
Intermediate & Advanced SEO | | crackingmedia0 -
Problems with Moz tools
We are so sorry that this issue didn't get the investigation it deserved. Please write in to help@moz.com and we will revisit the issue with you quickly. The data should certainly not be as far off as what you are seeing.
Other Research Tools | | Abe_Schmidt0 -
Exact match Title and H1 tags, and over optimization
I think that Google knows what (keywords) your articles are about - so there is less need to stuff them into a title tag. However, matching the keyword to what the searcher has in mind and will see bolded in the SERPs is still important. "providing variety within the SERP when compared to other results" Exactly. You need to stand out. Show that your content or product has special value, inspire the searcher to click on your page. I believe that title tags can move rankings if you can get the visitor to click and hold them after they land.
Intermediate & Advanced SEO | | EGOL0 -
Bad site migration - what to do!
Thanks for the advice, Paul - I've edited the pages with poor backlink profiles out of the redirects and 301'd the rest to relevant pages within the new website, and traffic has spiked up a bit. There were some very powerful links (national newspapers, etc.) the new site was missing out on.
Intermediate & Advanced SEO | | McTaggart0 -
Capitals in URLs
I agree with Anthony, it's not a problem - the user would just copy and paste the URL in. This can cause an issue on Windows servers, since they return the same page for Double-Beds-Luxury and double-beds-luxury (as Amazon's setup does). Because Google interprets URL characters literally, accidentally linking to the wrong case can make it appear that you have duplicate content - Google usually figures this out, but it's thought it can dilute the link juice flowing to that page. But you've mentioned that the page 404s if you lower the case, which is good - I guess the site's on a Linux server. I don't think there's anything to worry about in leaving the caps as they are.
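If you did want to guard against mixed-case links anyway, one common approach on Apache is a blanket lowercase 301 - a sketch, assuming access to the server config (RewriteMap cannot be declared in .htaccess):

```apache
# In the server or virtualhost config: map any URL containing
# uppercase letters to its lowercase equivalent via a 301.
RewriteEngine On
RewriteMap lc int:tolower
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]
```

This consolidates any stray mixed-case inbound links onto the canonical lowercase URLs.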
Intermediate & Advanced SEO | | AngelDigital0 -
Offering discounts and getting backlinks - concerned.
Yes, agreed - I'll ask for nofollows on the backlinks, as we're not doing this for SEO benefit anyway.
Intermediate & Advanced SEO | | McTaggart0 -
Duplicate content for hotel websites - the usual nightmare? is there any solution other than producing unique content?
Hi Kurt - very true - they should be taking the time for sure. I think part of the problem is the legacy of duplicate content - glad I'm not in their shoes! Yup - rewriting is what I'm doing for those guys, including new ideas for engaging content. Will let you know how it goes - an interesting project for me, as I've never worked with a directory before!
Intermediate & Advanced SEO | | McTaggart0 -
I have two sitemaps which partly duplicate - one is blocked by robots.txt but can't figure out why!
There are standards for .txt and .xml sitemaps, whereas there are no standards for the HTML variety. Neither guarantees the listed pages will be crawled, though. An HTML sitemap has the advantage of potentially passing PageRank, which the .txt and .xml varieties don't. These days XML sitemaps are more common than .txt sitemaps, but both perform the same function.
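For reference, the .txt format is the simplest of the standardised sitemap types - just one absolute URL per line, UTF-8 encoded, with no markup at all (hypothetical URLs):

```
https://www.example.com/
https://www.example.com/products/
https://www.example.com/contact/
```

It carries none of the optional metadata (lastmod, priority, etc.) that the XML format supports, which is why the XML variety is usually preferred.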
Intermediate & Advanced SEO | | Chris.Menke0 -
Just identified and reversed a severe footer links penalty - any similar experiences out there?
It's all been corrected now. The link above gives an exact example of how travel sites have typically been using footer links, and how they have been hit. I'd say my client was a carbon copy: a travel site with keyword-rich footers - a few of which pointed externally, most of which pointed at internal pages.
Intermediate & Advanced SEO | | McTaggart0 -
Changing title tags in well established site - should I do this gradually to avoid risk of penalty?
Luke, if you're writing these manually I doubt you could possibly write and publish them fast enough for it to be any problem at all. If this is automated and you're going to be releasing thousands or tens of thousands of new title tags, I'd slow down and do it in segments - not for fear of any "penalty", but because it's always good to test things out before making changes that big, unless you know for sure it's a change for the better - which it sounds like it is.
Intermediate & Advanced SEO | | Everett0 -
Optimized site-wide internal links in footer - a problem?
Thanks Michael - you're dead right in your approach there - amazed how many have got it so wrong by writing for Googlebot and not the actual site users. Found this interesting re: internal links - plenty of discussion on the issue but definitely a lack of clarity: http://www.seroundtable.com/google-internal-links-anchor-text-16864.html
Search Engine Trends | | McTaggart0 -
The risk of semi-hidden text, which only shows up when the page viewer clicks a button.
Check out this video: http://www.youtube.com/watch?v=EsW8E4dOtRY Summary: If you're not trying to stuff hidden text in there then don't worry about it, it's a normal thing on today's web.
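For context, the pattern being discussed is an ordinary click-to-reveal toggle - a minimal sketch:

```html
<button onclick="document.getElementById('more').hidden = false">
  Read more
</button>
<!-- The content is present in the HTML and crawlable;
     it's just visually collapsed until the user clicks. -->
<div id="more" hidden>
  Additional details revealed on click.
</div>
```

Since the text is in the page source for legitimate usability reasons rather than stuffed for crawlers, it falls on the safe side of the line the video describes.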
Search Engine Trends | | Schwaab0