Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
Medium-sized forum with thousands of thin-content gallery pages. Disallow or noindex?
Hey Chris, I agree that your current implementation, while not ideal, is perfectly adequate for ensuring you don't have duplicate content or cannibalisation problems - while still allowing Google to index the UGC images. You're also preventing Googlebot from seeing the user profile pages, which is a good idea, since many of them are very thin and mostly duplicate. So, from a pure SEO perspective, I think you've done a good job. However... I think you should also consider the ethical implications of potentially blocking Googlebot-Image as well. By preventing Google from indexing all those images of young girls fawning over the vacuous runners-up of a televised talent show, you would undoubtedly be doing the world a great service.
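For anyone weighing the two options in the title, the mechanisms behave differently: a robots.txt Disallow stops crawling altogether (so crawlers never see the pages, or any noindex tag on them), while a meta robots noindex allows crawling but keeps the pages out of the index. A hypothetical sketch of each (the paths are made up):

```text
# robots.txt - stops crawling of these paths entirely.
# Note: a disallowed URL can still appear in the index if other
# pages link to it, because Google never sees a noindex on it.
User-agent: *
Disallow: /members/

# Image crawling can be controlled separately via Googlebot-Image
User-agent: Googlebot-Image
Disallow: /gallery/
```

The noindex alternative goes on the page itself:

```html
<!-- On each thin gallery page: still crawlable, but excluded from the index -->
<meta name="robots" content="noindex, follow">
```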
| PhilNottingham
404s in GWT - Not sure how they are being found
Kelli - the first thing I thought was what garfield_disliker asks: have you set up Google Webmaster Tools to ignore these parameters that are important for the cart page to load? That said, Google Webmaster Tools is run by a team that's separate from the primary search team, so it's possible that GWT is flagging an issue that isn't an actual issue for Google. Run a search in Google for "site:yourdomain.com/UpdateCart" and see what URLs Google has indexed. If they have that 404ing URL, that's not good. If they have correct URLs, it's possible that this is a Google Webmaster Tools thing.
| KristinaKledzik
Importance of 301 Redirects
Thanks for all the great responses so far. The site was registered in 1997 and the current site has been up for 6 years. Here's the link profile for the site:
Domain Authority: 32
Domain MozRank: 4.08
Domain MozTrust: 4.86
External Followed Links: 258
Total External Links: 271
Total Links: 59,852
Followed Linking Root Domains: 71
Total Linking Root Domains: 76
Linking C-Blocks: 46
| stageagent
Excessive use of KeyWord?
Thanks for your answer, Anthony. If Google is not too fazed by this, can I add the words "South Africa" to the new pages I will be making? Two reasons for that: 1) consistency; 2) better-targeted keyword optimisation. P.S. In all honesty, I think by now Google is pretty much aware that my site is South African related, and can draw its own algorithmic conclusions for ranking us for SA-related queries - but for consistency purposes, what do you think?
| NikitaG
Bogus Crawl Errors in Webmaster Tools?
Thanks, Marcus, My numbers are rising rapidly right now... but hopefully the trend will reverse. I'll let you know if I learn anything.
| EGOL
Steady, but continous Google traffic drop. Help?
Well as for the 404s, I would run the site through Screaming Frog and see what I get. If you find 404s there, then redirects and edits will be needed to fix them. Otherwise forget about the GWT report. I think the most important thing for you to do (besides an overall marketing campaign) is fix those duplicate page titles and descriptions. This is a huge factor, undoubtedly. A PPC campaign (executed properly) can always help, but you shouldn't lean on it as your fix to your marketing campaign. It is only another spoke in the wheel of your overall efforts. Hope this helps.
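To act on the duplicate titles/descriptions point: once you have a crawl export (from Screaming Frog or similar), grouping URLs by title quickly surfaces the duplicates. A minimal sketch with made-up sample data (the URLs and titles are hypothetical):

```python
from collections import defaultdict

def find_duplicates(pages):
    """Return titles used on more than one URL.

    `pages` is a list of (url, title) pairs, e.g. parsed from a
    crawl export. Titles are normalised before comparing.
    """
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

crawl = [
    ("/red-widgets", "Widgets | Example Co"),
    ("/blue-widgets", "Widgets | Example Co"),   # duplicate title
    ("/about", "About Us | Example Co"),
]
print(find_duplicates(crawl))
# → {'widgets | example co': ['/red-widgets', '/blue-widgets']}
```

The same grouping works for meta descriptions; just swap in that column from the export.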
| jesse-landry
How do I find which pages are being deindexed on a large site?
Hi Daniel

Yep - as Mat says, there's no official solution to this. Do you mean deindexed by Google (without you wanting them to be), or deindexed by you on purpose? I suppose you could also:

- crawl your whole site (depending how big the site is)
- do a site: search in Google
- use the SERPs Redux bookmarklet to get all indexed URLs into a column in a spreadsheet
- compare your crawl vs. the indexed list; whichever URLs were not present in the SERPs could have been deindexed

This method is faulty, as it assumes all crawled URLs were indexed in the first place - but it could get you part of the way there.

-Dan
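The comparison step Dan describes is just a set difference. A tiny sketch with hypothetical URL lists (with the same caveat: it assumes every crawled URL was indexed at some point):

```python
crawled = {"/a", "/b", "/c", "/d"}   # URLs from your own site crawl
indexed = {"/a", "/c"}               # URLs collected from site: results

# URLs you can reach by crawling but that don't show in the SERPs
possibly_deindexed = sorted(crawled - indexed)
print(possibly_deindexed)  # → ['/b', '/d']
```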
| evolvingSEO
Categories in Places Vs Local
Hi Miriam, I think budget issues are always a consideration. I'm with you: if the owner doesn't get that this is a long term effort to increase their business, then I tell them I can't help them. I tell them that my job is to increase their business and that I need a commitment of time and money to do so. I generally try to get them to commit for a one year period and a budget large enough to actually accomplish something. These kinds of customers are harder to find, but, once I do, because I have the time and money to get a result, they tend to become more or less permanent customers. My biggest challenge is explaining what I'm going to do and how this will result in more business.
| waynekolenchuk
Self inflicted duplicate content penalty?
How many pages do you have total? If it is a small number this is an easy fix. A lot of the time people trust content to be unique and don't check, I myself am guilty of this. Thanks for reminding me that this can and does happen.
| PatrickCoombe
Robots.txt Download vs Cache
Not to my knowledge. You will have to wait. Google may well have already downloaded the new robots.txt even though the reports are still showing the older file. Those reports always take a while to refresh completely.
| FedeEinhorn
Target="_blank"
Short answer: highly doubt it. Long answer: I haven't seen anything from any of the ranking-factor studies Moz, or anyone else, produces that would indicate so. And under normal linking I can't imagine a situation that would cause a user-based issue (like increased bounces because you got launched in another window), unless it's really excessive. For example, I deal with professional photographers' sites, and they sometimes have a portfolio site and a blog - sometimes two different systems with two different experiences that open each other in a new window. That is very annoying, can kill the user's experience, and ends in high bounce rates. I think if you're just sending links to other useful sites, target="_blank" is perfectly fine to use. I do it on my site all the time.
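One usage note beyond the ranking question: when using target="_blank", it is widely recommended to add rel="noopener" so the opened page cannot script against the opener window via window.opener. A minimal example:

```html
<!-- External link opening in a new tab. rel="noopener" prevents the new
     page from accessing window.opener; "noreferrer" additionally implies
     noopener in older browsers. -->
<a href="https://example.com/" target="_blank" rel="noopener noreferrer">
  A useful external resource
</a>
```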
| WilliamBay
Sites for English speaking countries: Duplicate Content - What to do?
From what I am hearing, nothing much will change across countries, so why geo-target at all? Are you going to be developing any content that is different per country? Assuming you should geo-target, I'd recommend subfolders (domain.com/uk, domain.com/us, etc.) as you can then use some of the equity from the main domain. Then use WMT in Bing and Google to geotarget those folders to the countries in question.
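Assuming the subfolder setup described above, hreflang annotations are the standard way to tell search engines which folder targets which English-speaking country. A sketch with hypothetical URLs:

```html
<!-- In the <head> of every version of the page; each page lists all of
     its alternates, including itself. en-us / en-gb are language-region
     codes; x-default covers visitors matching neither. -->
<link rel="alternate" hreflang="en-us" href="https://domain.com/us/page" />
<link rel="alternate" hreflang="en-gb" href="https://domain.com/uk/page" />
<link rel="alternate" hreflang="x-default" href="https://domain.com/page" />
```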
| katemorris
Are the duplicate content and 302 redirects errors negatively affecting ranking in my client's OS Commerce site?
Hi Chris, There are 1259 temporary (302) redirects (the web designer has a problem changing them to 301 redirects and so has left the site like that). These redirects all point to the same page - the log-in page - (as one cannot write reviews on products unless logged in). I am concerned that this may look like spam behaviour and may be negatively affecting ranking. I feel it is up to the web designer to complete the site (and remove the 302s), but the web designer considers their job complete. I am trying to decide whether to take on the challenge and try to sort out the site in osCommerce (of which I have no experience), or whether it is better to start a WordPress blog on a fresh domain to attract traffic until the site owner is ready to rewrite the site in a more SEO-friendly format. Currently the site is indexed but will not rank for even the company name. Thanks so much for your help. Alison
| Web-Incite
Schema.org implementation for physician's office vs physician herself?
Thanks I really appreciate your reply!
| Titan552
I have an eCommerce store with a lot of 301 redirects. Would that hurt my rankings?
Having the redirects in place should not harm you in any way unless you are chaining them - see this Matt Cutts video about the limits on redirects. Regards, Chris Wilson
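To illustrate the chaining caveat: a single redirect is harmless, but chains (A → B → C) waste crawl hops, and crawlers stop following after a handful. A hypothetical sketch that walks a {source: target} redirect map and flags chains and loops (the 5-hop cut-off is an assumption; exact limits vary by engine):

```python
def resolve(url, redirects, max_hops=5):
    """Follow a {source: target} redirect map; return (final_url, hops).

    Raises ValueError on a loop, or if the chain exceeds max_hops -
    roughly the point past which crawlers may give up.
    """
    hops = 0
    seen = {url}
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen:
            raise ValueError("redirect loop at %s" % url)
        if hops > max_hops:
            raise ValueError("chain longer than %d hops" % max_hops)
        seen.add(url)
    return url, hops

redirect_map = {
    "/old-product": "/product-v2",  # fine: a single hop
    "/ancient": "/old-product",     # chained: /ancient -> /old-product -> /product-v2
}
print(resolve("/ancient", redirect_map))  # → ('/product-v2', 2)
```

The fix for a chain is to point every old URL directly at the final destination, collapsing each chain to one hop.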
| Chris_CM
404 not found page appears as 200 success in Google Fetch. What to do to correct?
Unfortunately, it doesn't work that way, Alex. Unless the server returns an actual 404 response code in the header, search engines will not consider the page to be an error to be removed from their index. Even though the content of the page may look like an error page, the http response in the header is the only thing that determines whether the engines will treat it as a 404 error. Paul
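Paul's point can be demonstrated locally: a page whose body looks like an error page is still a 200 to crawlers if the status line says so (a "soft 404"). A small self-contained sketch using only Python's standard library:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Soft404Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The body *looks* like an error page, but the status line says 200 -
        # which is all a search engine trusts.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<h1>404 Not Found</h1>")

    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 0), Soft404Handler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = "http://127.0.0.1:%d/missing-page" % server.server_port
with urllib.request.urlopen(url) as resp:
    status = resp.status          # 200, despite the error-looking body
    body = resp.read().decode()

server.shutdown()
print(status, body)
```

The fix is server-side: make the error handler send an actual 404 (or 410) status code, whatever the body says.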
| ThompsonPaul
Slow website
Thanks to you both. I disabled Total Cache and, ironically, the site loads a bit quicker. I will also look to combine the .css and .js files into single files if possible. Thanks again!
| StephenCallaghan
Duplicate pages on wordpress
Hey Thomas - thanks for jumping in! Just to clarify: Yoast will allow you to noindex categories - I'm sure that's what you meant. You can delete categories in the default WP setup. And I think you mean "noindex all subpages of archives".
| evolvingSEO