Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • The last time I used the tool, excluding the URL via robots.txt was also sufficient for removal. Recently, Google updated their documentation to strongly encourage using URL removal only for things like exposed confidential information, and not for cleaning up old pages or errors in your GWT account (see http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1269119). I know many people still use the tool for that kind of thing, but I wanted to point out the change.

    | KeriMorgret
    0

  • LOL!  I love your wording!  That actually sounds great!  It makes it sound much less like a spammy cheap SEO pitch and more like a colleague trying to help out a colleague!

    | MarieHaynes
    0

  • Reliability is a ranking factor. If you go offline once, you will be OK, but if it happens time after time you will drop in rank and will have to stay reliable for a long time to regain trust. If you are offline long enough, you will be de-indexed.

    | AlanMosley
    0

  • You are welcome. Here is what I found for you: http://www.casedetails.com/2011/02/10/what%E2%80%99s-the-right-number-of-outbound-links/ I regularly see pages with tons of outbound links that still carry a lot of trust. However, in your case, I would recommend focusing more on your visitors' impression. CTR and time on site are also important. If your site is cluttered with links, visitors will feel confused and leave.

    | SlavaRybalka
    2

  • Thanks all, it makes a lot of sense, but it brings me to another question... What do I put in the blog, and what do I put as an article in my articles section?

    | BeytzNet
    1

  • Looking at http://googlewebmastercentral.blogspot.com/2011/12/new-markup-for-multilingual-content.html and http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077, you would only want to implement rel=alternate tags when the content is duplicate content.  So if you write new content in a different language on a subdomain, content that doesn't appear on your main domain, don't specify a rel=alternate.  If it's a translation of content on your main domain, go ahead and add it. The difference in implementation between pages in completely different languages and pages with only small regional differences is that, for completely different languages, you won't want canonical tags pointing to one of the variations. Re-reading those articles, it's not 100% clear to me what adding this markup does when you don't implement the canonical tags.  Google says "This markup tells Google's algorithm to consider all of these pages as alternate versions of each other."
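    To make that concrete, here is a minimal sketch of the markup those articles describe; the domains and language codes are hypothetical, and each version lists every alternate, including itself:

    ```html
    <!-- On the English page (example.com/page.html) -->
    <link rel="alternate" hreflang="en" href="http://example.com/page.html" />
    <link rel="alternate" hreflang="es" href="http://es.example.com/page.html" />

    <!-- The Spanish translation (es.example.com/page.html) carries the same set -->
    <link rel="alternate" hreflang="en" href="http://example.com/page.html" />
    <link rel="alternate" hreflang="es" href="http://es.example.com/page.html" />
    ```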

    | john4math
    0

  • OK, I see. If you were to 301 them, you would end up with a lot of 301s very quickly, so why not just leave them there or delete them? Ideally I would have the pages generated from a database, and when the products are sold the page can 404. As I said, a 301 will not help you if they have no incoming links. If you really wanted to be tricky, you could have a 404 page with an auto search that takes the URL, assuming the URL has product info in it, and searches for similar products to display.
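    A rough sketch of that "tricky" 404 idea, assuming the product name is embedded in the URL path (the function name and URL pattern here are made up for illustration):

    ```javascript
    // Hypothetical helper: pull likely search terms out of a sold product's URL,
    // e.g. "/products/red-leather-sofa.html" -> "red leather sofa".
    // A 404 page could feed the result into the site's own search
    // to display similar products instead of a dead end.
    function termsFromUrl(path) {
      var last = path.split('/').filter(Boolean).pop() || '';
      return last
        .replace(/\.[a-z0-9]+$/i, '')   // drop the file extension, if any
        .split(/[-_+]/)                 // split on common word separators
        .filter(Boolean)
        .join(' ');
    }
    ```

    On the 404 page itself you could then redirect to something like `/search?q=` plus those terms, or render the matching products inline.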

    | AlanMosley
    0

  • Natural backlinks are links that are created to your site without you doing anything to request or buy them.  They could be in content, in a sidebar, or anywhere on the linking site.  These links are usually EARNED, but sometimes the linking site is a scraper that did not strip out the links.  Natural links are usually high quality, but they might not be. Quality backlinks are links that might provide traffic or help your site advance in the search engines.  They can be high-PR links on relevant sites, but they don't have to be.  A link from an obscure page on the Pope's site might not be relevant and might not be high PR, but I bet it would help your Google rankings.

    | EGOL
    0

  • Hi Cody. I just got in on the tail end of this. I'd also put keywords in the file name. Check out a previous thread on this.

    | AWCthreads
    0

  • Hi Naghirniac, Many directories charge, but this doesn't mean they are black hat. The key concept is editorial inclusion. A directory that accepts anyone is not a directory you want to be associated with. This includes directories filled with porn, gambling, and payday loan sites. On the other hand, the harder it is to get into a directory, the more value it usually passes. This is true even when the directory charges money for "review" services. Be careful - directory listings are meant to enhance your backlink profile, not act as its foundation. Here's a helpful article: http://www.seomoz.org/blog/seo-link-directory-best-practices Best of luck!

    | Cyrus-Shepard
    0

  • I think that's the way to go, my brain on a Friday isn't coming up with anything else! I will experiment and see what works and report back.

    | speedyseo
    0

  • OK, forget keyword density; chasing a density number only makes the page less user friendly. Here is what I would focus on instead (in my opinion). I call it the three simple keyword rules: The title must contain the keyword and run no more than 65 characters, which keeps it focused and relevant, e.g. "Rules for Friends with Benefits". The heading should reflect the title in a more focused and detailed way, e.g. "These are the rules for friends with benefits". The content delivers the answer to the heading: start with a summary that reflects the title, then let the full-length article expand on the heading; you will naturally include the keyword because you structured the article around it. Follow these rules and you won't need to worry about keyword density, and you'll automatically create good content that is visible and often beats the competition in the SERPs. Best, Gustav
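    As a bare-bones sketch of those three rules in page markup (the titles and text are just the example from above):

    ```html
    <!-- Rule 1: the title carries the keyword, under 65 characters -->
    <title>Rules for Friends with Benefits</title>
    <body>
      <!-- Rule 2: the heading reflects the title in a more detailed way -->
      <h1>These are the rules for friends with benefits</h1>
      <!-- Rule 3: content opens with a summary echoing the title,
           then answers the heading in full -->
      <p>Summary paragraph that reflects the title...</p>
      <p>Full-length article structured around the heading...</p>
    </body>
    ```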

    | Gustav-Northclick
    0

  • I have decided that I'm just gonna try it and see how it reacts; I will report back when I have a conclusion.

    | ReneReinholdt
    0

  • Does this compression impact pagespeed in a positive way?

    | Naghirniac
    0

  • Yeah, that's the idea, a forum of all forums : )

    | SEODinosaur
    0

  • Nicholas: I'll check with the good folks at SEOmoz support. Might take a while to get an answer. I'll let you know what I find. Cheers - Axel

    | axelk
    0

  • Hi Karan, No, having a numeric digit within the domain name would not harm your SEO in any way. Good luck!

    | ClickConsult
    0

  • Hi, I will ask a private question in March. Thank you very much.

    | salvyy
    0

  • From what I've read, they're indexing Facebook and Disqus comments on blogs.  On the site I work on, on certain pages we load the main page contents via AJAX, and Google is reading and indexing that content. I'd be wary of taking the time to put together HTML snapshots, as Google is getting better and better at this.  You could set up a prototype of your AJAX solution and wait and see if Googlebot can index the AJAX loaded content.
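    A minimal prototype for that "wait and see" test could be a page that injects its main content client-side; after it has been live a while, check whether the injected text shows up in a site: search. The URL and element id here are hypothetical:

    ```html
    <div id="main-content">Loading...</div>
    <script>
      // Hypothetical endpoint returning an HTML fragment; if Googlebot
      // indexes the injected text, it can handle your AJAX content.
      fetch('/ajax/main-content.html')
        .then(function (res) { return res.text(); })
        .then(function (html) {
          document.getElementById('main-content').innerHTML = html;
        });
    </script>
    ```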

    | john4math
    1