Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
What's the best way to deal with 4xx errors?
I just took a look at your site, and it looks like you've resolved that index.html issue. Nice work! Mike
| Mike.Goracke0 -
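As a general approach to the 4xx question above: the right fix depends on the status code and on whether anything still links to the URL. A minimal sketch of that triage logic in Python (the decision rules here are illustrative assumptions, not a fixed standard):

```python
# Minimal sketch: triage a URL from a crawl report that returns a 4xx status.
# The rules below are illustrative assumptions, not a fixed standard.

def triage_4xx(url: str, status: int, inbound_links: int) -> str:
    """Suggest an action for a URL returning a 4xx status code."""
    if not 400 <= status < 500:
        return "not a 4xx"
    if status == 410:
        # 410 signals the page was removed on purpose.
        return "leave as 410 (intentionally gone)"
    if inbound_links > 0:
        # Links still point here, so preserve that equity.
        return "301 redirect to the closest relevant live page"
    return "fix or remove the internal links pointing here"
```

Running each crawl-report row through a helper like this turns a raw error list into a per-URL action list.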
Remove html file extension and 301 redirects
This looks good to me; the .html pages are 301 redirecting to the non-.html versions.
| Tom-Anthony1 -
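The mapping behind this kind of redirect is simple enough to sketch: each old .html URL should 301 to its extensionless version. A hypothetical helper that computes the target (the actual redirects would live in the server config):

```python
# Sketch of the URL mapping behind the redirect: strip a trailing ".html"
# so each old URL 301s to its extensionless version.
def extensionless(url: str) -> str:
    """Return the redirect target for a .html URL."""
    if url.endswith(".html"):
        return url[: -len(".html")]
    return url  # already extensionless; no redirect needed
```

Running the full URL list through this gives the old-to-new map to verify against the live 301s.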
Why is my site not being indexed in Google?
Our site content is unique. Today I checked Google Webmaster Tools and 750 of our 1,300 links are now indexed. I don't understand Google's crawl timing or the concept behind it: we waited for a month and it crawled only 5,300 of 22,000 links, and we got 469 404 errors and 57 "not found" errors, with the error count increasing day by day. After that we redesigned the site to be more user-friendly, changed a few internal links, created a new sitemap, and resubmitted it in Webmaster Tools last Saturday. The next day it crawled only 4 links; after 5 days (today) it has crawled 750. The error links are changed in the new sitemap, so when will Google crawl all the links and clear the errors?
| Rajesh.Chandran0 -
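For resubmission cases like the one above, the sitemap itself is just an XML list of URLs. A minimal sketch of generating one with Python's standard library (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap string from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")
```

Regenerating the sitemap from the current URL list (rather than editing the old file) avoids carrying stale 404 URLs forward into the resubmission.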
Will an XML sitemap override a robots.txt
The robots.txt file will prevent Google from showing further information for the disallowed pages, but it doesn't prevent indexation. The pages are still indexed (that's why you're seeing them), just with no meta description or text taken from the page, because Google wasn't allowed to retrieve more information. If you want them to start showing that information, you just need to remove that rule from robots.txt, and the page details will soon begin to appear. If you want them out of the index, add the noindex meta tag to each page, which is the only directive that will actually prevent indexation, and then use GWT to request their removal from the index.
| mememax0 -
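Python's standard library can demonstrate the crawl side of this: a Disallow rule stops fetching, not indexing. A small sketch using urllib.robotparser with a hypothetical rule set:

```python
import urllib.robotparser

# A Disallow rule blocks crawling only; as described above, it does not
# remove a URL from the index. The rules and URLs here are hypothetical.
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("*", "https://example.com/private/report"))  # False
print(rp.can_fetch("*", "https://example.com/public/page"))     # True
```

A URL for which `can_fetch` is False can still appear in results as a bare link, which is exactly the symptom described in the answer.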
Author & Video Markup on the Same Page
Yes. A video blog ("vlog", or whatever it's called by the cool kids these days) is totally appropriate; Whiteboard Friday is exactly that kind of thing. I can't really think of any other use case right now... maybe interviews, if the authorship is pointing at the interviewer?
| PhilNottingham0 -
Solutions for too many on-page links?
Thanks Takeshi, you make some valid points. Makes sense to me.
| unikey0 -
Schema.org for restaurant menus
Hi Adam, I know this is an old thread, but if you see this you might find it useful. I just checked a particular brewery site; they have their menu marked up with the URL, and it seems to be picked up by the rich snippets tool OK, although it does throw a few errors. I tested it on a few of my other tools and it definitely gets picked up without a problem. Here is the GWT link
| Ontarioseo0 -
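As a rough illustration of the approach discussed above, here is a hypothetical structured-data payload for a restaurant that links out to its menu URL, built as JSON-LD in Python; the names and URLs are placeholders, not the brewery site mentioned:

```python
import json

# Hypothetical schema.org markup for a restaurant whose menu lives at its
# own URL. All field values below are placeholders.
markup = {
    "@context": "http://schema.org",
    "@type": "Restaurant",
    "name": "Example Brewery",
    "menu": "https://example.com/menu",       # URL of the menu page
    "servesCuisine": "Pub fare",
}

json_ld = json.dumps(markup, indent=2)
```

Embedding `json_ld` in a `<script type="application/ld+json">` block is one way to expose this to validators; the original thread used whatever markup the rich snippets tool of the time accepted.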
Can view pages of site, but Google & SEOmoz return 404
You might consider opening a help ticket where we can take a private look at the URLs and figure out why Roger isn't happy. We're a little more backed up than usual at the moment in help, so it may be a day or so before someone can get back to you, but it's something to try.
| KeriMorgret0 -
Google sees 2 home pages while I only have 1
Don't use a canonical tag; that is not what it's for. Use a 301 redirect instead, and make sure that all the links within your site point to domain.com. When people link to your site, you want them to always link to the same URL. A canonical tag will not hide index.htm, so you will still accumulate links and bookmarks at the wrong URL.
| AlanMosley0 -
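The 301 mapping described above can be sketched as a small helper that sends every index.htm/index.html URL to the directory root it duplicates (a sketch of the mapping only; the redirect itself belongs in the server config):

```python
def redirect_target(url: str) -> str:
    """Map index.htm/index.html URLs to the directory root they duplicate."""
    for name in ("index.html", "index.htm"):
        if url.endswith("/" + name):
            return url[: -len(name)]
    return url  # already canonical; no redirect needed
```

With this in place (server-side), both "home pages" collapse into one URL, and new links and bookmarks accumulate against it.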
Importance of WMT change of address and problem doing it
Yes, that is one way. Here is a link to several methods for verifying your site: https://sites.google.com/site/webmasterhelpforum/en/verification-specifics If you have access to the registrar (or know who does), it's much easier to verify that way.
| DarinPirkey0 -
Best way to implement noindex tags on archived blogs
Hi there, I definitely admire your creativity on this one, but unfortunately Google Tag Manager loads tags asynchronously. The tags are added for users who execute JavaScript on page load, but the copy of the page that is crawled does not contain them, which means the Google crawler won't see a noindex tag on the crawled version of the page if it's loaded via Google Tag Manager. I think your reasoning for noindexing the pages themselves is a very good one: you're removing pages with thin or potentially duplicate content from Google's index, which is healthy, while keeping the content on the page for the user, which again is a healthy thing to do. I can definitely see the reasoning. Unfortunately, the only way I can see the tags being implemented is manually by the webmaster. If the site runs WordPress, you can change the robots meta data very quickly with the Yoast SEO plugin: go to the page in question, scroll to the Yoast plugin section, and you'll be able to select the noindex tag from a drop-down menu, so it can take as little as 30 seconds. Hope this helps, and good luck with the implementation.
| TomRayner0 -
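The asynchronous-loading point above can be demonstrated with a sketch: a crawler reads the raw served HTML, so a check like the hypothetical one below only finds a noindex tag that is present in the source, never one injected later by JavaScript:

```python
import re

def has_noindex(raw_html: str) -> bool:
    """Check the *served* HTML source for a robots noindex meta tag.

    A tag injected after load by JavaScript (e.g. via a tag manager)
    never appears in this raw source, so a crawler reading the source
    alone will not see it.
    """
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, raw_html, re.IGNORECASE) is not None
```

Fetching a page server-side and running this check is a quick way to confirm whether a noindex directive is actually visible to crawlers.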
Suddenly Many 301 Redirects captured by SEOMOZ
I found the problem: it was my sort function.
| noerdar0 -
How do I get Google to index my new pages?
The Fetch as Google tool is not the same as the Submit URL tool that I mentioned. The pages that are not yet indexed might have accessibility issues; in that case, you will see the URLs in the Issues and Warnings report in Webmaster Tools. If your site is a WordPress site, you can install the BWP GXS plugin, which automatically sets the ping frequency for each section and helps get the site indexed quickly. Another option is to share your site's URLs on social media platforms or to update the site with fresh information daily. Once Google's bots identify a site as static, it takes some time for the crawl and indexing frequency to increase, so make sure you keep updating content on the site regularly for the next few days.
| mayanksaxena1 -
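One concrete step alongside the advice above: Google accepts a sitemap "ping" URL that notifies it of sitemap changes. A sketch of building that URL in Python (pinging notifies Google of the update; it does not guarantee indexing):

```python
from urllib.parse import quote

def sitemap_ping_url(sitemap_url: str) -> str:
    """Build the Google sitemap-ping URL for a given sitemap.

    Fetching the returned URL notifies Google that the sitemap changed;
    it does not guarantee that the pages will be indexed.
    """
    return "http://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")
```

A plugin like the one mentioned in the answer automates exactly this kind of ping whenever the sitemap changes.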
Does 301 redirect of old filenames still work?
Thanks, this is helpful. I did change the URLs twice since 2004: once because of the redesign and, more recently, because of the change in platform (from static pages to WordPress).
| optimind0 -
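Having changed URLs twice, it's worth making sure the old 301s don't chain (2004 URL to redesign URL to WordPress URL). A sketch of flattening a redirect map so every old URL points straight at its final destination; the map entries in the example are hypothetical:

```python
def flatten_redirects(redirect_map: dict) -> dict:
    """Resolve each old URL to its final destination, so every 301 is a
    single hop rather than a chain across multiple migrations."""
    flat = {}
    for old in redirect_map:
        target, seen = old, set()
        # Follow the chain until it leaves the map (guarding against loops).
        while target in redirect_map and target not in seen:
            seen.add(target)
            target = redirect_map[target]
        flat[old] = target
    return flat
```

Feeding the server the flattened map keeps every legacy URL one 301 away from the live page, no matter how many redesigns sit in between.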
Which Pagination/Canonicalization Page Selection Approach Should be Used?
Hello Oxfordcomma, If you have fast page load times on the View All pages, you can make those canonical. This is Google's recommendation: http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html. If those pages can be large and cause latency issues (slow loading), the better option is rel next/prev, and none of the pages would be "canonical" for the others, as each would stand on its own. You may consider at that point adding a robots noindex,follow tag to the View All page, but Google generally does a very good job of figuring this out on its own, and I prefer to let it do so. In summary: if you have good View All pages with fast load times, use those as canonical, regardless of how many products you have (e.g. 5 or 25), as long as no latency issues are apparent. Use this tool to test it: https://developers.google.com/speed/pagespeed/insights. If the View All pages for most of your categories are too big to load quickly, go with rel next/prev. Rel next/prev info: http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html View All canonical info: http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html It can get a little more complicated if you are dealing with pagination AND faceted search or multiple URL parameters acting as filters.
| Everett0 -
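The rel next/prev option above amounts to emitting prev/next link tags on each paginated page. A minimal sketch, assuming a ?page=N URL scheme (the parameter name is an assumption):

```python
def pagination_links(base_url: str, page: int, total_pages: int) -> list:
    """Generate the rel=prev/next link tags for one paginated page.

    Assumes a ?page=N URL scheme, with page 1 living at the bare base URL.
    """
    links = []
    if page > 1:
        prev_url = base_url if page == 2 else f"{base_url}?page={page - 1}"
        links.append(f'<link rel="prev" href="{prev_url}">')
    if page < total_pages:
        links.append(f'<link rel="next" href="{base_url}?page={page + 1}">')
    return links
```

The first page gets only a next tag and the last page only a prev tag, which is what signals the start and end of the series.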
How can I do SEO for an eCommerce site?
Chandu, I think your question is too big for anyone to answer. Books and books and pages and pages of articles have been written about how to do SEO for an eCommerce site, so you cannot expect anyone to provide the full answer here. Perhaps if you were more specific, someone could help you. Yes, you should definitely rewrite the content for each product instead of using the manufacturer's duplicate content. Here are some resources for you: http://www.seomoz.org/webinars/ecommerce-seo-fix-and-avoid-common-issues http://www.seomoz.org/blog/qa-from-ecommerce-seo-fix-and-avoid-common-issues-webinar http://www.seomoz.org/blog/building-deep-links-into-ecommerce-pages http://www.searchenginejournal.com/7-ecommerce-seo-tips-for-2013/57530/ Good luck!
| Everett0 -
Partial mobile sitemap
Have a look at the resources below and let me know if you need more help. Responsive web design is recommended by Google, but it does not apply to all websites; you have to take into consideration things like user intent, navigation, and content. There is a lot of debate about it here on SEOmoz, and I would proceed carefully by reading some of the resources below. One thing I would advise, though, is to keep the pages blocked in robots.txt until you figure out what you want. In my opinion, Bryson Meunier has written quite a lot about the subject on several respected blogs in the industry. http://googlewebmastercentral.blogspot.com/search/label/mobile http://www.brysonmeunier.com/ http://www.seomoz.org/blog/seo-of-responsive-web-design http://www.seomoz.org/q/mobile-seo-tips-and-best-practices Hope it helps!
| echo10