Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Which is The Best Way to Handle Query Parameters?
The new pagination advice is really tough to navigate. I have mixed feelings about rel=prev/next (hard to implement, not supported by Bing, etc.), but it seems generally reliable. If you have pagination AND parameters that impact pagination (like sorts), then you need to use prev/next and canonical tags together; see the post Alan cited.

I actually do think NOINDEX works fine in many cases, if the paginated pages (page 2 and beyond) have little or no search value. It really depends on the situation and the scope, though. This can range from no big deal at all to a huge problem, depending on the site in question, so it's tough to give general advice.

I'm not having great luck with GWT parameter handling lately (as Alan said), especially on big sites. It just doesn't seem to work in certain situations, and I have no idea why Google ignores some settings and honors others. That one's driving me crazy, actually. It's easy to set up and you can try it, but I wouldn't count on it working.
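To make the prev/next-plus-canonical combination concrete, here is one common pattern for the head of page 2 of a sorted listing (the URLs and parameter names are hypothetical, just for illustration):

```html
<!-- Hypothetical head of example.com/widgets?page=2&sort=price -->
<!-- prev/next chain the pages of the series together -->
<link rel="prev" href="https://example.com/widgets?page=1&sort=price">
<link rel="next" href="https://example.com/widgets?page=3&sort=price">
<!-- canonical folds the sort variant into the plain paginated URL -->
<link rel="canonical" href="https://example.com/widgets?page=2">
```

The idea is that prev/next describe the paginated sequence as the user sees it (sort included), while the canonical collapses the sort variants of each page into a single preferred URL.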
| Dr-Pete0 -
Homepage/Root domain de-indexed by Google
When I was in a similar situation where I didn't have the best of relations with the development company, I used Pole Position's free Code Monitor (https://polepositionweb.com/roi/codemonitor/index.php) to check the robots.txt files of the live site and any development sites/subdomains on a daily basis. I'd get an email if anything had changed, so I could go to the dev company right away and try to mitigate any problems.
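If you'd rather roll your own monitor than rely on a third-party tool, a minimal sketch of the same idea is below. It fingerprints the robots.txt body and flags a change since the last check; the URL is a placeholder, and you'd wire the stored hash and the email alert to whatever you already use (cron, a database, etc.):

```python
import hashlib
import urllib.request

def content_hash(text):
    """Return a fingerprint of a robots.txt body."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def check_robots(url, last_hash):
    """Fetch robots.txt and report whether it changed since last_hash.

    Returns (changed, new_hash) so the caller can store the new hash
    and trigger an alert when changed is True.
    """
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode("utf-8", errors="replace")
    new_hash = content_hash(body)
    return new_hash != last_hash, new_hash
```

Run it daily against the live site and each dev subdomain, and you get roughly what the hosted monitor provides.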
| KeriMorgret0 -
Root domain ranks higher than sub pages
Thanks Dan. Worked out it was the Google account login that showed the funny listing. I think I need someone to look at my links, as we always ranked well and had high listings; recently our pages have dropped off and are being beaten by sites which don't have many inbound links and have less PR, but are ranking higher. Thanks for the comments, though, and for spending the time to reply.
| NickKer0 -
Removing some of the indexed pages from my website
It is a good point: you can 301 redirect them, but this is only worthwhile if they have links, since you can only 301 redirect requests. If they have links, then it may be a good idea to leave the pages in place or reuse the same URL with similar content.
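For reference, on an Apache server a 301 for a retired page is a one-liner in .htaccess (the paths here are made up for illustration):

```apache
# Permanently redirect a retired URL to its closest replacement
Redirect 301 /old-page.html /new-page.html

# Or, with mod_rewrite, redirect a whole retired directory
RewriteEngine On
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]
```

Either way, the redirect only does anything when a visitor or crawler actually requests the old URL, which is the point above.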
| AlanMosley0 -
Quoting From Another Part of Your Site
Matt gives a good answer; I can only add one thing: what is your aim? If it is useful to users, then OK, go ahead (see Matt's answer), but if it is for ranking, then it's not going to do anything for you.
| AlanMosley0 -
Merged old wordpress site to new theme and have crazy amount of 4xx and duplicate content that wasn't there before?
I would do the following:

- Double-check your database using phpMyAdmin.
- Double-check via FTP to make sure the pages are not outside your WordPress install.
- I see that you're using an IDX plugin. I have used these in the past; make sure the IDX platform you are using did not duplicate anything on their end.
| entourage2120 -
How to copy link target from google serps
This article lists a bunch of Greasemonkey scripts that do this.
| EricaMcGillivray0 -
SEOMoz Crawling Errors
I just had a smack upside the head moment. I had all of the links in the main nav as absolute but my links in the footer were all relative. That was giving me my 404 errors. Hopefully this will get me back on Google's good side. We dropped in the ranks a little for some of our main keywords.
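For anyone hitting the same thing: a document-relative href resolves against the current directory, so the same footer markup points somewhere different on every page, while root-relative and absolute links always resolve the same way (URLs hypothetical):

```html
<!-- Rendered on https://example.com/products/widgets/ -->
<a href="contact.html">Contact</a>
<!-- resolves to /products/widgets/contact.html - likely a 404 -->

<a href="/contact.html">Contact</a>
<!-- root-relative: always /contact.html -->

<a href="https://example.com/contact.html">Contact</a>
<!-- absolute: always correct, at the cost of hard-coding the domain -->
```

That is why the absolute main-nav links were fine while the relative footer links 404'd from deeper pages.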
| TRICORSystems0 -
Hreflang on non-canonical pages
Wow, that's a tricky one. I haven't seen a lot of good data on rel="alternate" hreflang="x" yet, other than general suggestions that it's worth trying and seems to sometimes work. Technically, Google would say that the green/red/yellow versions aren't true duplicates, but practically, I think the canonical here is a good approach. My gut feeling is that the canonical will overpower the hreflang - adding the alternate language versions to the color variants won't hurt you, but it probably won't do anything. If it were me, I think I'd go with your first approach - just add the hreflang tag to the canonical version and leave it off the variants. Monitor that - see how it impacts your international rankings, and go from there.
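In markup terms, the suggested first approach looks something like this (URLs and language codes are hypothetical): only the canonical color version carries the hreflang annotations, and the variants carry only the canonical.

```html
<!-- Head of example.com/en/widget-green (the canonical color version) -->
<link rel="canonical" href="https://example.com/en/widget-green">
<link rel="alternate" hreflang="en" href="https://example.com/en/widget-green">
<link rel="alternate" hreflang="de" href="https://example.com/de/widget-green">

<!-- Head of example.com/en/widget-red (a color variant): canonical only -->
<link rel="canonical" href="https://example.com/en/widget-green">
```

Then monitor international rankings before deciding whether adding hreflang to the variants is worth testing.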
| Dr-Pete0 -
Sudden SERP stop for main keyword.
Two other possibilities. First, GWT says there is a severe site health issue: robots.txt is blocking some important page. But it's a .js file that Website Tonight claims is standard protocol, and that it is just an image file. Secondly, I foolishly renamed my homepage and it caused a duplicate content issue; GoDaddy/Website Tonight won't allow me to use a 301 redirect. But those wouldn't only happen to one particular keyword.
| VictorVC0 -
Adding 'NoIndex Meta' to Prestashop Module & Search pages.
I'd implement canonical tags for your duplicate content problem instead of noindex tags; this is the recommended practice for duplicate content. As for pages that you don't want indexed: when you use robots.txt to accomplish this, Google can and will still show the URL and title in SERPs. To stop that, you need to put a meta noindex tag on every such page.
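Putting both halves of that advice into markup (URLs hypothetical):

```html
<!-- On duplicate-content variants: point at the preferred URL -->
<link rel="canonical" href="https://example.com/category/widgets">

<!-- On module/search pages that should stay out of the index entirely -->
<meta name="robots" content="noindex, follow">
```

Note the noindex tag only works if crawlers can actually fetch the page, so don't also block those URLs in robots.txt.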
| EricaMcGillivray0 -
Duplicate Page Title with Pretashop
The title tags on each page of the site should be different. If they all have " - ENA Shop" at the end, that's OK, so long as there is something before it that is descriptive of the page. I'd also consider changing "ENA Shop" to "ENA Nutritionals", since that seems like the better brand and the better keyword. I just did a crawl of your site, and it seems that the majority of the pages have their own unique title.

You do seem to have an issue with lots of 404 pages still being linked to, as well as lots of pages being indexed that should not be, such as shopping cart pages. Run your site through a tool like Xenu (desktop), Screaming Frog (desktop), or the IM Ninjas Crawler (browser-based). They will give you a full list of pages with broken links you need to fix, as well as tell you which pages are being indexed by Google that shouldn't be.

Another good test is to do a search on Google for site:enasport.com. If you see any pages listed there that don't have unique content, like shopping cart pages or "compare product" pages that will be duplicate content, you should noindex them.
| KaneJamison0 -
Can Microformat hCalendar publish a clients conference schedule?
Hi, In theory you can do it. Although we have been working with microdata rather than microformats, after reading through the Google Webmaster Help it looks quite easy to implement. Check the example article at Google (the 2nd example is the one you are looking for); it should give an insight into how to include it. I hope that helped, Istvan
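As a rough sketch, a single conference session marked up with the hCalendar microformat might look like this (the event name, times, and venue are placeholders):

```html
<!-- One session marked up as an hCalendar vevent -->
<div class="vevent">
  <span class="summary">Opening Keynote</span>:
  <abbr class="dtstart" title="2012-09-15T09:00">Sept 15, 9:00am</abbr> to
  <abbr class="dtend" title="2012-09-15T10:00">10:00am</abbr>
  at <span class="location">Main Hall, Example Conference Center</span>
</div>
```

Each session on the schedule would get its own vevent block; the machine-readable dates live in the title attributes while the human-readable text stays visible.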
| Keszi0 -
New to rich snippets, help needed
Hi Jason, First of all, I would like to underline that you can only mark up rich snippet data on a product page if it is visible to the user on the page. The only exception is hidden values, such as telling search engines the currency you are using. If you have a product description on the page, then you can include it in the rich snippet by inserting a few lines of code into the page. For further documentation, I would recommend reading through the following article: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=146750 Check the examples on the page; they helped me a lot while I was working on our site's rich snippets. These should answer your first two questions. On the 3rd question: there is a submission form from Google where you need to submit your site (please be aware that it can take up to a month until you see results). We have seen results in a weekend; we were lucky! If you have further questions, let me know, I am here to help. Greetings, Istvan
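To illustrate the visible-vs-hidden distinction, here is a rough schema.org microdata sketch for a product page (all names and values are placeholders, not your actual markup):

```html
<!-- A product marked up with schema.org microdata -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <span itemprop="description">A short, user-visible product description.</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">19.99</span>
    <!-- Hidden value: the currency may be non-visible metadata -->
    <meta itemprop="priceCurrency" content="USD">
  </div>
</div>
```

The name, description, and price are wrapped around text the user already sees, while the currency rides along as a meta tag.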
| Keszi0 -
Microformats & Schema.org query
I think if the data is already there (you're already showing product sizes), then adding rich snippet tags will not make any difference, except to notify Google that the data is there and can be used in SERPs. We have just applied this to a few of our eCommerce websites; message me if you want to look at how we have used the tags.
| Lantec0 -
Getting Rid of Duplicate Page Titles After URL Structure Change
Thanks so much for your response. Honestly, I'm still not sure why or what happened. Essentially the domain has a bit of history: it's been up since 2008 and is a very well-known authority in the niche. The site was taken down by the government in its domain-seizure crackdown and held for 13 months. They had to give it back because they had no probable cause, and we were innocent. It's been a big issue in the media, especially when all the SOPA madness was going on, but I'm not sure if that 13-month disruption caused any issues for us.

When we came back up we were doing really well, ranking very high and doing about 5k visitors a day from Google. We had our highest Google traffic day on March 4th, and then for some reason we went into a massive free fall. Google just left. When I searched site:dajaz1.com, it showed paginated pages, category, tag, and search pages ranking; it stopped ranking our post pages.

I started going through the site to try and figure out what was wrong. I set everything that's not the home page or an actual post to noindex, follow; we started working on writing more, better titles, better SEO URLs, removed the pagination plugins, etc. We're often the originators of content and sourced as such, so it's not necessarily an original-content issue.

Googlebot came back, but it's still not ranking us like it was, and it's now ranking people above us who link to us, source us, or copy our content. When I check site:dajaz1.com now, it's at least pulling posts, but it's pulling posts with numbers in the URL, or very old URLs. Our newer posts are not ranking. We were getting 5k-8k unique visitors a day organically; now we're getting less than 1,000. I'm still at a loss as to why, and how to correct it.
| malady0 -
Crawler Stats
This article about canonicalization might help: http://www.seomoz.org/learn-seo/canonicalization
| Stevej240 -
What is the best way to fix legacy overly-nested URLs?
Thanks Alan and Irving, your responses are both very helpful. In reality, these pages have relatively few external links pointing to them compared to other sections of the site, so I think I will opt to redirect them. The newer sections of the site have a nice clean URL structure and good on-page optimization, so I think it's best to bite the bullet and move the older pages over to a new system.
| ThemeParkTourist0