Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
Sitemap.xml showing up in Google Search
Proper on-page optimization doesn't guarantee that your keyword will appear in the SERPs. For that, you need to increase your Page Authority by getting some good links. Also, from your screenshot I can see that you didn't use your keyword in the image ALT attributes (which count for about 20% of on-page optimization for a particular keyword) or in the H1 (about 30%). Make sure the keyword appears in each at least once.
| smokin_ace
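To make the advice concrete, here is a minimal sketch (not from the answer above) that checks whether a keyword appears in a page's H1 text and image ALT attributes, using only Python's standard library. The URL and keyword are hypothetical placeholders.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class OnPageChecker(HTMLParser):
    """Collects H1 text and img ALT attributes from an HTML page."""
    def __init__(self):
        super().__init__()
        self.in_h1 = False
        self.h1_text = []
        self.alts = []

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.in_h1 = True
        elif tag == "img":
            # Record each image's alt text (empty string if missing).
            self.alts.append(dict(attrs).get("alt") or "")

    def handle_endtag(self, tag):
        if tag == "h1":
            self.in_h1 = False

    def handle_data(self, data):
        if self.in_h1:
            self.h1_text.append(data)

keyword = "blue widgets"  # hypothetical target keyword
html = urlopen("https://example.com/").read().decode("utf-8", errors="replace")
checker = OnPageChecker()
checker.feed(html)

in_h1 = keyword.lower() in " ".join(checker.h1_text).lower()
in_alt = any(keyword.lower() in alt.lower() for alt in checker.alts)
print(f"Keyword in H1: {in_h1}; in an ALT attribute: {in_alt}")
```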
Http and Https Update
Hi Matthew, You definitely want your pages to resolve to one version or another, either http or https. Don't leave it for the search engines to sort it out. For instance, take a look at Paypal, which redirects every single URL to https. Google and all major search crawlers can now handle https with ease, but if you place your content on 2 different URLs, this can count as dupe content. If the https pages actually redirect (via 301) to http, there is no issue of cloaking. Does that help?
| Cyrus-Shepard
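A quick way to verify the behavior the answer recommends is to request the http URL without following redirects and confirm it returns a 301 pointing at https. A minimal sketch, assuming the third-party requests library and hypothetical URLs:

```python
import requests

def check_https_redirect(url):
    # Don't follow redirects, so we can inspect the first response directly.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code == 301 and location.startswith("https://"):
        print(f"OK: {url} -> {location}")
    else:
        print(f"Check: {url} returned {resp.status_code} (Location: {location or 'none'})")

for page in ["http://example.com/", "http://example.com/account"]:
    check_https_redirect(page)
```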
Long Urls/Google Webmaster Tools and SEOMOZ Pro
I assume that, to stop Webmaster Tools from crawling those pages, you've changed your robots.txt file? Our SEOmoz crawlers will obey the rules in the robots.txt file, so when your site is recrawled, those errors should drop away for the pages you've said not to crawl.
| EricaMcGillivray
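If you want to confirm what your robots.txt actually allows before the next crawl, Python's standard library can parse it for you. A minimal sketch; "rogerbot" is the SEOmoz crawler's user-agent, and the URLs are hypothetical placeholders:

```python
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

for url in ["https://example.com/", "https://example.com/private/report"]:
    print(f"rogerbot may crawl {url}: {rp.can_fetch('rogerbot', url)}")
```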
Well, I need some help, advice, something.
If you 301 redirect them to the main site, Google will recognize the change and remove any listings. Expect the process to take 30 days.
| RyanKent
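For reference, the redirect itself is simple in most stacks. Here is a minimal sketch, assuming a Flask application and hypothetical domain names, that 301-redirects every path on the old site to the same path on the main site:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def to_main_site(path):
    # Preserve the path and query string so every old URL maps to its
    # equivalent on the main site.
    target = "https://www.main-site.example/" + path
    if request.query_string:
        target += "?" + request.query_string.decode("utf-8")
    return redirect(target, code=301)
```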
How does robots.txt affect aliased domains?
I'm assuming you can't 301-redirect (and that you still need the sub-directory versions to be reachable by humans)? I'm not sure the cross-domain canonical will work completely. I don't have a good example of a sub-folder to root domain canonical implementation. If the "sites" are identical, it should be ok. Robots.txt is going to depend a bit on how people access those. If there are links to the sub-directory versions, then blocking will cut off that link-juice (and the canonical or a 301 will be better). Blocking the sub-directory shouldn't automatically block the domain it aliases, too, unless for some reason that sub-directory is the only crawl path Google has to the outside domain. As long as they're crawling the outside domain separately, I think you'll be ok. I'm just not sure if Robots.txt is necessary here. Sorry, the devil may be in the details on this one. Happy to take a closer look in Private Q&A, if you want to give out some specifics.
| Dr-Pete
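One point worth illustrating: robots.txt is evaluated per hostname, so a Disallow on the main domain's sub-directory says nothing about the aliased domain, which serves its own robots.txt. A minimal sketch with hypothetical domain names:

```python
import urllib.robotparser

def parser_for(host):
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(f"https://{host}/robots.txt")
    rp.read()  # each host serves its own, independent robots.txt
    return rp

main = parser_for("main-site.example")        # suppose it has "Disallow: /alias-dir/"
alias = parser_for("aliased-domain.example")  # a separate file, possibly wide open

print(main.can_fetch("Googlebot", "https://main-site.example/alias-dir/page"))
print(alias.can_fetch("Googlebot", "https://aliased-domain.example/page"))
```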
How can I prevent sh404SEF Anti-flood control from blocking SEOMoz?
Frankly speaking, there is no reason for you to keep the Security Function switched on. I have around 20 websites built on the Joomla CMS, all of them running sh404SEF, since I've found it the most accurate and bug-free SEF component for Joomla. In all the time I've been using it, the Security Function has been switched off, and I recommend the same for you: it's useless. Remember, the main purpose of sh404SEF is to make your website more SEO friendly, not to protect against hacker attacks, so it isn't a security component, and that feature may well have bugs of its own. BTW, I've read that many people have had the same problem with other robots, not only SEOmoz, so be careful: it could also have a negative impact on other search engines. My advice: turn this sh*t OFF ;)))
| smokin_ace
How can i increase my website traffic
I'd also start off by reading the Beginner's Guide to SEO. That will give you enough background to help improve your traffic and to understand the reports from your SEOmoz campaigns, which give you tips on how to improve your site. http://www.seomoz.org/beginners-guide-to-seo I'd look at making the title tags for your site more descriptive than "Home" and "Battery", and also verify your site in Google Webmaster Tools to get a bunch of helpful information from there.
| KeriMorgret
HTTP301 or link ?
I always say, go with what the search engines give you. We think we know a lot, but sometimes a page will rank well and you just don't know why. Take it and run with it.
| AlanMosley
My site cannot be found by google at all
I also notice a line in the code on the home page that isn't a standard command as far as I know. I don't know if it's screwing anything up, but you might try removing it and seeing if that helps. Simon also has a ton of good advice here.
| KeriMorgret
Title Missing: Page is an Action
I wouldn't worry that it's missing a title tag; I'd just work on making sure that it's not indexed.
| KeriMorgret
What is the best way to deal with pages whose content changes?
Excellent. That's what we'll probably end up doing. Egol and Alex, thanks for the responses.
| ChatterBlock
Blocking AJAX Content from being crawled
Hey Phil. I think I've fully understood your situation, but just to be clear, I'm presuming you've got URLs exposing 3rd-party JSON/XML content that you don't want indexed by Google. Probably the most foolproof method in this case is the "X-Robots-Tag" HTTP header convention (http://code.google.com/web/controlcrawlindex/docs/robots_meta_tag.html). I'd recommend going with "X-Robots-Tag: none", which should do the trick (I really don't think "noarchive" or the other options are needed if the content isn't being indexed at all). You'll need to modify your server-side scripts to do this; I'm assuming there's not much pain required for you (or the 3rd party?) to do that. Hope this helps! ~bryce
| BryceHoward
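As a concrete example of the server-side change, here is a minimal sketch (assuming a Flask app; the endpoint and payload are hypothetical) that attaches the header to a JSON response:

```python
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/feed")
def feed():
    resp = jsonify({"items": []})
    # "none" is equivalent to "noindex, nofollow": crawlers that honor the
    # header will neither index this response nor follow links in it.
    resp.headers["X-Robots-Tag"] = "none"
    return resp
```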
HTTP headers
It doesn't hurt the site directly. These headers can help you optimize your website's speed: setting the "Expires" header to a future time lets the browser cache resources, making your site faster to load and navigate. So it depends on your niche and competitors: if all other variables are equal and the only difference between the two sites is speed, then you'd better optimize your site's speed.
| wissamdandan
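To show what a far-future "Expires" looks like in practice, here is a minimal sketch assuming a Flask app; it also sets the modern Cache-Control equivalent:

```python
import time
from wsgiref.handlers import format_date_time

from flask import Flask

app = Flask(__name__)

@app.after_request
def add_caching_headers(resp):
    one_week = 7 * 24 * 3600
    # Expires wants an HTTP-date; format_date_time produces one (RFC 1123).
    resp.headers["Expires"] = format_date_time(time.time() + one_week)
    resp.headers["Cache-Control"] = f"public, max-age={one_week}"
    return resp
```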
Best local listings submitting service
I just wanted to get other SEOs' opinions about Localeze.com (a local directory submission tool) recommended at http://getlisted.org/enhanced-business-listings.aspx. We paid $3,500 to use their system and entered a few companies to test. We were very careful to enter the proper, recommended NAP and categories. After 5 months, we checked the 100+ directories they claim the listing data would be sent to, but only 5% of the places showed our listing. Has anyone used Localeze's services? What are your experiences? We feel like we wasted our time and money using Localeze's services.
| CertifiedSEO
How to fix and test Google's indexing / caching problem
Hi WebBizIdeas, It seems this is a known issue with your shopping cart software. This thread should help answer your question: http://forum.cs-cart.com/topic/12883-problem-with-the-seo-addon/page__st__20 Hope that helps, Sha
| ShaMenz