Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hmmm... feels like I'm misunderstanding part of the question here. To get your AJAX content indexed, your server needs to return an HTML snapshot to the crawler, so the simple (or not-so-simple) answer is to inject a meta robots noindex, follow tag into the HTML of the snapshot, just like you would on a regular HTML page. How you do this depends on your technology choice and server configuration, which unfortunately I probably can't help you much with. If you'd like, feel free to leave a few more details about your particular situation and the solutions you're using, and perhaps another community member can chime in.
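For reference, here's a minimal sketch in Python of injecting that tag into a snapshot before serving it. The snapshot string and the `add_noindex_follow` helper are illustrative assumptions, not part of any particular framework:

```python
# Illustrative sketch: add a meta robots noindex, follow tag to an
# HTML snapshot before returning it to the crawler. A production
# version should use a real HTML parser instead of string replacement.

def add_noindex_follow(snapshot_html: str) -> str:
    """Insert <meta name="robots" content="noindex, follow"> after <head>."""
    meta = '<meta name="robots" content="noindex, follow">'
    # Only replace the first <head> occurrence.
    return snapshot_html.replace("<head>", "<head>" + meta, 1)

snapshot = "<html><head><title>AJAX page</title></head><body>...</body></html>"
print(add_noindex_follow(snapshot))
```

How you hook this into your response pipeline depends entirely on your server stack, as noted above.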

    | Cyrus-Shepard
    1

  • Howdy! Did this response help answer your question, or are you still looking for some more assistance?

    | KeriMorgret
    0

  • Thanks, Alan. He isn't using text-indent:-9999px, so that's one plus. I don't know if it's an option to go back and re-size 3000 images, but I agree with you that would be the best option.

    | SeaDrive
    0

  • You seem to have evaluated things well. I will give you one reason why you should not: links pointing to the non-indexed pages will be pouring their link juice away. You are better off using a noindex,follow meta tag; at least then the link juice will flow back out of the links when followed. Robots.txt is a nasty tool to use here; you need a more surgical approach.

    | AlanMosley
    0

  • Does that page have more links, or good quality links? Go with what Google gives you; if that page ranks well for the keyword, take advantage of it.

    | AlanMosley
    0

  • Using subdomains for locations is a technique that spammers took advantage of, and as a result it is no longer recommended. Use directories/folders, or simply targeted landing pages in the same folder. a) You can code it so that you don't show a global phone number in the header. b) It's your site; you don't have to create links that take the user to deeper location pages if you don't want to. c) Regardless, you need to create unique content anyway; whether it's on a subdomain or in a directory, it's still using the same template, and you need totally unique content for each page. Don't just swap out location names.

    | irvingw
    0

  • Hey, The same basics apply, but I've found that links to the domain and the authority of the domain matter more than page authority. In terms of fast indexing, make sure you have an RSS feed. Also, if you are trying to get accepted into Google News, do make sure you have met those requirements: http://support.google.com/news/publisher/bin/answer.py?hl=en&answer=40787. Having a news-specific XML sitemap is also helpful: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=74288

    Why would you remove the content a day or two after publication? If you have an online newspaper, that content may still be relevant for some time after the original publication date. I've tended to keep news content around years after original publication and still gotten some traffic to those articles. Given that, I've not had experience removing content that quickly, but when I have removed content from news websites, I try to redirect the old URL to the most relevant new article (if possible) or a news category page (usually easier to do). This preserves link value, which is obviously important. For really old stories that don't have links or traffic, I set the article to return a 410 status on that URL to signal that the page is removed permanently. Again though, I'd default to keeping the article around as long as possible. Hope that helps.
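That retirement logic (redirect when a relevant target exists, otherwise 410 old stories with no links or traffic) could be sketched like this; every function and map name below is hypothetical, purely to illustrate the decision order:

```python
# Illustrative sketch of the article-retirement logic described above:
# 1) redirect to the most relevant new article if one is known,
# 2) return 410 Gone for stories with no links or traffic,
# 3) otherwise fall back to a 301 to the news category page.

def retire_article(url, redirect_map, category_map, has_links_or_traffic):
    """Return an (HTTP status, redirect-target-or-None) pair for a removed URL."""
    if url in redirect_map:                      # most relevant new article
        return 301, redirect_map[url]
    if not has_links_or_traffic(url):            # nothing worth preserving
        return 410, None
    return 301, category_map.get(url, "/news/")  # category page fallback

redirects = {"/news/old-story": "/news/follow-up-story"}
categories = {"/news/ancient-story": "/news/politics/"}
print(retire_article("/news/old-story", redirects, categories, lambda u: True))
```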

    | Matthew_Edgar
    0

  • At first I was leaning toward disagreeing, but I couldn't find anything to back it up. I know some SEO tools have IP address checkers to make sure you're in a good IP neighborhood and block, and I thought this would fall in the same line as what was originally asked:

    "A clean IP address. You might think your site is squeaky clean, but did you know your IP address has a dirty secret? That’s right. Unbeknownst to you, your neighbor is into some very shady things. Check to see if you’re in a bad IP neighborhood by running an IP checker like MXToolbox.com."

    The entire post can be found here: http://www.bruceclay.com/blog/2011/04/the-seo-bucket-list-3-things-to-do-before-your-site-dies/

    So because of these tools checking your IP block, I was always under the assumption that a shady site on a server could in fact have some impact on your rankings. However, I am hard-pressed to find any evidence to support this. Here's another quote from the above reference:

    "About 3% of all web sites "own" a private ip number, with the remainder being on virtual, or name-based, servers. Although only 3% are dedicated ip's, we have seen that in many instances well over 90% of the top-50 results in the search engines are sites having dedicated ip numbers. This was so strange that we have repeatedly validated these findings, and have found that switching a site from a virtual ip to a dedicated ip number alone has caused significant ranking increases. Of course, the web is so dynamic that this could be coincidence, but we do not think so. Likewise, we have found that there are "dirty" ip c-blocks, ranges of ip numbers that have been tarnished by spammers and left to be reassigned to unsuspecting sites. If your site is in the range of the spammers ip, then you are equally penalized. We have likewise found instances where simply moving a site has caused the ranking to improve."

    I'm glad I came across this post; you learn something new every day. Did I misunderstand the statement in the above-mentioned article, or is this perhaps an outdated theory? Thanks in any event for teaching an old dog something new.

    | anthonytjm
    0

  • "Agents" Thanks very much for the answer, and I appreciate the help... I think in the case of this site, since it is a very simple blog, swapping the tags and categories won't completely break the site or the SEO. But for the sake of preventing any misunderstanding in general, categories and tags are fundamentally quite different. Many people do swap their use, and get away with doing so because the sites are simple. First, refer to the "Categories vs. Tags" section in my post on the Moz blog. To reiterate the basic differences here:

    Categories:
    - Can have a hierarchy (be nested as parents and children).
    - Are best used as a "menu": if the entire website is a blog, the categories are designed to fit into your main menus; this is built right into the WordPress functionality. This menu can be the main menu at the top, or in the sidebar.
    - A category archive can be the homepage.
    - 5-8 categories (the main buckets of topics on your blog) is best. If you want more, you can nest them.
    - Categories have always existed, from the beginning of WordPress.
    - When used in this proper fashion, the category archives should be indexed by engines.

    Tags:
    - Tags have NO hierarchy (they can't be nested).
    - They are not to be used in the menu of a site, whether the main menu or sidebar menu. You see them a lot as a "tag cloud" in the sidebar or at the bottom of posts. They work better as this type of navigation, if users want to find similar articles in a much more specific way.
    - A tag archive can NOT be the homepage.
    - You can use as many tags as you want, but they should not be the same as categories. They should be more detailed than categories.
    - Tags didn't exist from the beginning of WordPress; they were added later.
    - When used in this best-practice way, I always advise NOT indexing tags (at least at first). Tag pages will be thin, almost duplicate, and of little use to the searcher. It's MUCH better to land on a category archive, or better yet a post itself.

    -Dan

    | evolvingSEO
    1

  • Hi Naveen, Did you look at which competitive sites now outrank you and see any patterns? Did the drop in organics correspond with a Google update? Did you change anything on-page on the landing pages? Just some things to look at.

    | KevinBudzynski
    0

  • Great advice.  Not thought of that. Thanks so much.

    | stevecounsell
    0

  • Maybe submit a support ticket to SEOmoz to see if the 404's might have been false positives.

    | AdamThompson
    0

  • Wow! That really sucks. The negative guy has no soul, heart, or integrity. I've often heard from other SEO folks in the industry that it's not what they do to their own site that makes them rank well; it's what they find wrong with the competition and report to Google that gets them the best return. The negative guy in the forum post took this to a whole other level. I'd love to hear what Google has to say about that.

    | anthonytjm
    0

  • Thanks, Mr. SEO, I've just cancelled the PHP redirect code... it must be the source of my SEO problems. Anyway, I would like to create different sites, but what if somebody would like to see the English version? In that case, should I put a link to the English site? And should I redirect every single page to the corresponding one, or is it better to put a generic link that points to the English home page? Thanks from a junior

    | guidoboem
    0

  • Yes - I can say from experience that this is true.  You'll also feel much more inspired to keep working towards 'coat stand', after you've achieved the mini-success of already ranking for 'cheap coat stand'!

    | AgentsofValue
    0

  • In terms of SEO there is no extra advantage to it! And at the same time this sounds a bit difficult. In my opinion the domain name should be short, simple, and easy to remember rather than all these shortcut tactics…

    | MoosaHemani
    0

  • Hi Cyril, I've not seen any specific case studies or statements that establish how much replication is considered duplication, although I'd be as interested as you in this information if anyone out there knows of some? Personally, I have been working on a 50% minimum basis (e.g. at least 50% of the page's written content should be unique), and it's been working well for me. You might get away with less. In regards to making sites an authority within their specific countries while using duplicate content and the canonical tag, it's all down to links. Even though you're telling Google that this isn't the original source of the content, it's still possible to build up authority on the domain by acquiring links from strong sites that share the TLD. But again, without unique content you're not going to see the full strength of these links. David
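As a rough illustration of that 50% rule of thumb, here's a toy sketch that estimates how much of a page's text also appears in a shared template. This is only an approximation for sanity-checking your own pages; real duplicate detection (and whatever the engines actually measure) is far more involved:

```python
# Toy illustration of the 50%-unique rule of thumb: compare a page's
# word 3-gram "shingles" against shared template text and estimate
# what fraction of the page is duplicated.

def shingles(text, n=3):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def duplication_ratio(page, template):
    """Fraction of the page's shingles that also appear in the template."""
    s_page, s_tpl = shingles(page), shingles(template)
    if not s_page:
        return 0.0
    return len(s_page & s_tpl) / len(s_page)

shared = "our company offers fast friendly service in every city we cover"
page_a = shared + " plus a unique paragraph written just for london customers"
print(f"{duplication_ratio(page_a, shared):.0%} of page A matches the template")
```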

    | mrdavidingram
    0