Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
GWT Malware notification for meta noindex'ed pages?
Thanks for sharing your views Moosa.
| Saijo.George0 -
SEO Friendly Calendar System
I'm really impressed with the Tribe Calendar. I've been using it for almost a year on a community-run arts initiative, Mass Culture, and it's working out really well. Not only does it look great, but in terms of SEO it has the most complete functionality I've yet found. The new geolocation information really assists in local search too.
| Robin_Jennings0 -
How to create a site map for a large site (ecommerce type) that has 1,000s if not 100,000s of pages.
I agree with Chris. With such large websites it would be advisable to have a sitemap index and then split it into various individual sitemaps, such as pages, products, categories, images, media, tags, etc.
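As a sketch of the split-sitemap approach (file names and URLs below are placeholders, not from any real site), a sitemap index can be generated with a few lines of Python:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap_index(sitemap_urls):
    """Return a sitemap index document listing each child sitemap URL."""
    root = ET.Element("sitemapindex", xmlns=SITEMAP_NS)
    for url in sitemap_urls:
        sitemap = ET.SubElement(root, "sitemap")
        ET.SubElement(sitemap, "loc").text = url
    return ET.tostring(root, encoding="unicode")

# Hypothetical split sitemaps for pages, products, and categories
index_xml = build_sitemap_index([
    "https://www.example.com/sitemap-pages.xml",
    "https://www.example.com/sitemap-products-1.xml",
    "https://www.example.com/sitemap-categories.xml",
])
```

Keep in mind the sitemap protocol limits each child sitemap to 50,000 URLs and 50 MB uncompressed, which is why splitting by content type (and numbering the product files) scales well.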
| Robin_Jennings0 -
404 or 503 Malware Content?
Thanks Peter, apologies for the delay; I was tied down with some other things. Your help is much appreciated.
| Saijo.George0 -
Lowercase Rewrite in web.config. Strange behavior.
Hi Greg, great to hear that it's working ok and was a cache issue. Thanks for the update!
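For anyone finding this thread later: a typical lowercase rewrite rule for IIS's URL Rewrite module in web.config looks something like the sketch below (the rule name and redirect type are up to you, and you'd normally exclude static assets and querystrings as needed):

```xml
<rewrite>
  <rules>
    <rule name="Lowercase URLs" stopProcessing="true">
      <!-- Match any URL containing an uppercase character -->
      <match url="[A-Z]" ignoreCase="false" />
      <!-- 301 to the lowercased version of the same path -->
      <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```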
| Aleyda0 -
Web address change - Search impact?
Hi there, I see that it's whosjack.org that you have redirected to http://wjlondon.com/ at the moment, so I guess you have already done the migration? I see that you still have some internal pages indexed under whosjack.org, although you have also 301-redirected them to http://wjlondon.com/ (check that all the internal links go to the new URLs, and also use the Change of Address option in Google Webmaster Tools). Ideally, especially at the beginning, you should focus on a single domain to consolidate your content and popularity. Additionally, you should avoid featuring the same content on two domains, since that would cause content duplication issues. For domain migrations, take a look at these best practices, where I also link to other resources that will help you plan and implement the migration while minimizing errors and negative SEO impact. I hope it helps!
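If the old domain happens to be on an Apache server, the path-preserving 301 can be done with a short .htaccess sketch like this (assuming mod_rewrite is enabled; adjust for your actual host setup):

```apache
# Send every request on the old domain to the same path on the new one
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?whosjack\.org$ [NC]
RewriteRule ^(.*)$ http://wjlondon.com/$1 [R=301,L]
```

Redirecting path-to-path (rather than everything to the new homepage) is what lets the individual pages pass their equity to their new equivalents.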
| Aleyda0 -
Are W3C Validators too strict? Do errors create SEO problems?
Google is a different case when run through the validator. I actually read an article on why Google's sites do not validate: they serve so much traffic that it actually saves them a good amount of money not to close tags that don't matter, things like adding a self-closing / to an img tag and the sort. While I do not think that validation is a ranking factor, I wouldn't totally dismiss it. It makes code easier to maintain, and it has actually gotten me jobs before. Clients have actually run my site through a validator before hiring me. Plus, funny little things work out too: someone tested my site on Nibbler and it came back as one of the top 25 sites, and I get a few hundred hits a day from it. I will take traffic anywhere I can get it.
| LesleyPaone0 -
Did I cause my 70% drop in organic traffic?
Hi Danny, did any of these responses answer your question?
| Christy-Correll0 -
How to avoid plagiarism and theft of content on article directory sites?
Does this apply to reputable article sites as well? We were under the impression that posting on some of the better-known ones would be a good way to start our link building campaign. Obviously we're having second thoughts about this now. Should we steer completely clear of link building even through the more reputable article submission sites?
| suchde0 -
:8088 showing up on end of URL in natural Google search results
Thanks Lesley, yes we do have a mac server set up. Will get our host to fix asap. Appreciate the quick response and info! Regards.
| costumebox0 -
Implementing Canonical & Alternate tags on a large website
JavaScript is client-side code that's generally rendered in browsers, which means Google's crawlers typically won't see it, and they almost definitely won't process it for these kinds of directives. You can create these tags dynamically, but you need to do it with a server-side scripting language like PHP, ASP, etc. That's a common practice, and many large sites dynamically code canonical tags, META ROBOTS, etc. (I've done it on many sites).
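The answer mentions PHP or ASP; as a language-agnostic sketch of the same server-side idea (the base URL and lowercasing policy here are assumptions, not part of the original answer), here is a small Python helper that builds a canonical tag from the requested URL, stripping query strings so tracking parameters don't create duplicate canonical targets:

```python
from urllib.parse import urlsplit

def canonical_tag(request_url, base="https://www.example.com"):
    """Build a canonical <link> tag for the requested URL, dropping the
    query string and fragment and lowercasing the path so variants of
    the same page all point at one canonical URL."""
    path = urlsplit(request_url).path.lower() or "/"
    return '<link rel="canonical" href="%s%s" />' % (base, path)
```

In a real template, this string would be emitted server-side into the page `<head>`, which is exactly the kind of dynamic tag generation the answer describes.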
| Dr-Pete0 -
Google indexes dynamic webpages after blocking them in robots.txt...
Unfortunately, robots.txt is a poor choice for content that may have already been indexed, including dynamic content. It's good for blocking specific pages and folders (especially prior to Google crawling them), but it tends to be unreliable in these situations. Pagination is a tricky topic, and the "best" solution varies a lot with the situation, but the basic options are:
(1) Use rel="prev" and rel="next", which helps Google handle the paginated series properly but still allows it to rank.
(2) Use META NOINDEX, FOLLOW on pages 2+ of search results (this was probably the most popular method before rel=prev/next).
(3) Use rel=canonical to point all paginated results to a "View All" page. This page should be available to users and not be too large; it's a decent option if you have a few dozen results, but not 100s or 1000s.
(4) Use Google Webmaster Tools parameter handling on the "page=" parameter. It seems to work, but since it's Google-specific, it's not the go-to option for most SEOs.
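To make options (1) and (2) concrete, the `<head>` of page 2 in a paginated series would carry tags like the sketch below (URLs are hypothetical, and the two options are alternatives, not meant to be combined blindly):

```html
<!-- Option 1: declare the page's place in the series -->
<link rel="prev" href="https://www.example.com/widgets?page=1" />
<link rel="next" href="https://www.example.com/widgets?page=3" />

<!-- Option 2: keep pages 2+ out of the index but let link equity flow -->
<meta name="robots" content="noindex, follow" />
```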
| Dr-Pete0 -
Is it possible to export Inbound Links in a CSV file categorized by Linking Root Domains?
You are correct. Excel should be able to match on that example or one similar to it. You may need to do some reading within the Microsoft help pages to see how it uses wildcards and regular expressions (for example, I don't know if you need the asterisk in your query). Just mess around with the function and you should be able to figure it out.
| CleverPhD0 -
Website being crawled but not indexed any thoughts?
You're absolutely right, Chris. I was in the process of rewriting all the content and getting the blog going. I'm also now looking at quality links etc. Google Places is done, so I was trying to get some local citations. Ant
| Ant710 -
How to best keep client hosting separate but manageable?
Why can't you just use a large VPS account with WHM/cPanel and give each client an account? As for the different servers, just use different IP addresses and no one will ever know. I would look into a setup from either webhostingbuzz.com, which has a couple of data centers in the US and Europe, or OVH if you know how to manage a server.
| LesleyPaone0 -
Keyword stuffing
Could you please look at my site http://vervo.lv/? The Moz page grade says I have 72 keywords on the page, but I see just 9, so where are the other keywords? The keyword is "Kravu pārvadājumi" or "starptautiskie kravu pārvadājumi"; I haven't yet decided which one to use for the first page. I thought the problem was using many targeted menu links, but when I disable them nothing changes.
| vervo0 -
Link profile
1. Yes, worry. 301 those old pages to your new page or new website. 2. I don't know if this is a question, but I'm assuming you're talking about the links from the first question and whether they are meaningful. I say yes; nofollow links are a critical component of a solid link profile. 3. I would not worry about this. Again, a natural link profile will have anchor text that is not ideal.
| BrianJGomez0 -
Rich Snippets for recipe pages not appearing in Google
Having the rich snippet markup is not a guarantee Google will use the information. We see that all the time, especially with author tags. Providing the information to Google just tells them it's available, and as they tweak their results it tends to move around a lot. It sounds like you've already checked Webmaster Tools to see how your markup is perceived by Google.
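For reference, one common way to mark up a recipe for rich snippets is a schema.org Recipe block in JSON-LD, along these lines (all values below are hypothetical; microdata works too, and Google's testing tool will show what it can parse):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Example Pancakes",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "prepTime": "PT10M",
  "cookTime": "PT15M",
  "recipeIngredient": ["2 cups flour", "1 egg", "1 cup milk"],
  "recipeInstructions": "Mix the ingredients and fry on a hot griddle."
}
</script>
```

Even with valid markup in place, as the answer says, whether the snippet actually appears in the results is entirely Google's call.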
| BCutrer0 -
Rel canonical question
Hi Dana How do I change such rel canonical for Volusion store? Thanks in advance.
| admonster0