Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi Dave, Just make sure you do all the usual SEO work on the new content: tags, headers, and so on (keyword density matters less than it used to, but is still worth a glance). As for combining older content, just set up redirects from the old URLs; that will take care of any problems with backlinks. The thing I emphasize most when moving to a new site structure is the redirects, and I think most would say the same.

    | William.Lau
    0
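As a sketch of the redirect step described above, an Apache `.htaccess` consolidation might look like this (the paths and merged destination are hypothetical):

```apache
# Hypothetical paths: permanently redirect the old article URLs to the
# merged page so existing backlinks keep passing value.
Redirect 301 /old-article-one /guides/merged-article
Redirect 301 /old-article-two /guides/merged-article
```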

  • Technically, rel=prev/next doesn't de-duplicate the way the canonical tag does, but it should solve any problems for Google. I don't believe we currently consider rel=prev/next when determining duplicate titles. Klarke is right - you could just give those pages semi-unique titles. We're not handling rel=prev/next as well as we could be (it turns out to be a tricky tag to parse well). Looking at your pages, your implementation appears to be correct. My gut reaction is that you're probably OK here. You're doing what Google claims they want (at least what they want this week).

    | Dr-Pete
    0
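For readers unfamiliar with the tags under discussion, a correct paginated series (URLs and titles hypothetical) would carry markup like this in the `<head>` of page 2, including the semi-unique title Klarke suggested:

```html
<!-- Page 2 of a hypothetical paginated category -->
<title>Blue Widgets - Page 2 | Example Store</title>
<link rel="prev" href="https://www.example.com/blue-widgets?page=1">
<link rel="next" href="https://www.example.com/blue-widgets?page=3">
```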

  • I agree with Matt - as long as your primary internal links are consistent, it's OK to use a short version for offline purposes. The canonical tag is perfectly appropriate for this. The other option would be to use a third-party shortener that has built-in tracking, like Bit.ly. It uses a 301 redirect, but also captures the data. If you're just doing a test case, this might be easier all around.

    | Dr-Pete
    0

  • One of the example scenarios Google gives is: "Your pages have broadly similar content within a single language, but the content has small regional variations. For example, you might have English-language content targeted at readers in the US, GB, and Ireland." Tough call; you might have to do some research to see whether this solution will help in your particular scenario.

    | David_ODonnell
    0
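The mechanism Google recommends for that scenario is the hreflang annotation; a minimal sketch for the US/GB/Ireland example (URLs hypothetical) would be placed in the `<head>` of each regional page:

```html
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/">
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/gb/">
<link rel="alternate" hreflang="en-ie" href="https://www.example.com/ie/">
```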

  • Let us assume that I have already processed all URL parameters in Google Webmaster Tools. The question is: will SEOmoz see those URL parameter changes, or will it ignore them? Sorry for the confusion. To answer my own question: those URL parameters, I presume, will only work for Google; they are a way to tell Google not to crawl those URLs. If I let SEOmoz crawl my website, what should I do to make SEOmoz bypass those unnecessary URLs? I am not really an SEO guru, but I am willing to learn.

    | paumer80
    0
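Since the Webmaster Tools parameter settings apply only to Googlebot, one way to keep other crawlers (including SEOmoz's crawler, rogerbot) out of parameterized URLs is robots.txt; a sketch, with the parameter names purely hypothetical:

```
# Block crawling of URLs carrying session/sort parameters.
User-agent: rogerbot
Disallow: /*?sessionid=
Disallow: /*?sort=
```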

  • First, I've done a lot of full-site 301s - meaning all pages on the site redirected to pages on the new domain, not just domain to domain - and never lost 10% (usually 1%). Second, the answer was based on how you wrote the question. You could simply redesign one page (the one with less PR, PA, etc. would be my suggestion) and redirect it to the other. Just do not change anything in that URL. Hope it helps, Robert

    | RobertFisher
    0

  • Hi Matthew, I think your suggestion is great, as it would let Google know that I'm aware I have duplicate content, and tell it which version should be prioritized. However, there are other SEO issues that would not be fixed and that I'd like to get fixed with this "project". If I keep all versions active with the canonical tag, I'm still spreading my incoming link juice across different subdomains, thus limiting my potential domain authority... For this reason I'm still thinking of making the real big & bold move... which is still giving me goosebumps!

    | DuProprio.com
    0

  • Thanks Derek! Since it's a pretty competitive market I think I will have to build links directly to the office pages. The problem is that directories often don't allow deep linking. I will have to work hard to find the ones that will allow it.

    | optimumweb
    0

  • Hi, There are a few different ways to answer your question... First, those pages are technically duplicated. If you go to either URL, they return the same content with a status 200. That should be easy enough for your programmers to correct: they need to detect whether the URL contains ".php" and, if so, 301 redirect to the ".html" version. Remember, though, that it isn't just the extension that differs: you have dashes in the HTML version and underscores in the PHP version. However... Second, the real question for duplicate content isn't "do I have it?" but "where are the multiple URLs?" You want to know how SEOmoz found that ".php" version if you haven't linked to it internally. The way to do that is to download the CSV file from the upper right-hand corner of the duplicate content page. Column L (the duplicate page content column) says "TRUE" if that page is duplicated, and column AM contains the referral. Check that referral source to see how the crawler found the link. Finally, check whether Google and Bing have indexed these PHP pages, and whether you've gotten traffic to them. If they haven't been indexed and you don't have any traffic, you are probably okay. But I'd still investigate to find the source and then redirect the PHP version to the HTML version. (Edit: check external links too... is anybody else linking to these ".php" pages?) I hope that helps. Thanks, Matthew

    | Matthew_Edgar
    0

  • I am quite late to add my reply to this question because I was busy fixing an issue with dynamic URLs. I have made the following changes on my website. I have rewritten all dynamic URLs as static ones, excluding session IDs and the internal search option, because I have restricted both of those via robots.txt. I have set canonicals on near-duplicate pages, as Dr. Pete described in "Duplicate Content in a Post-Panda World". I want to give one live example. Base URL: http://www.vistastores.com/patio-umbrellas The following URLs were dynamic, but I have rewritten them as static ones; a canonical tag pointing to the base URL is present on each of these near-duplicate pages:
    http://www.vistastores.com/patio-umbrellas/shopby/limit-100
    http://www.vistastores.com/patio-umbrellas/shopby/lift-method-search-manual-lift
    http://www.vistastores.com/patio-umbrellas/shopby/manufacturer-fiberbuilt-umbrellas-llc
    http://www.vistastores.com/patio-umbrellas/shopby/price-2,100
    http://www.vistastores.com/patio-umbrellas/shopby/canopy-fabric-search-sunbrella
    http://www.vistastores.com/patio-umbrellas/shopby/canopy-shape-search-hexagonal
    http://www.vistastores.com/patio-umbrellas/shopby/canopy-size-search-7-ft-to-8-ft
    http://www.vistastores.com/patio-umbrellas/shopby/color-search-blue
    http://www.vistastores.com/patio-umbrellas/shopby/finish-search-black
    http://www.vistastores.com/patio-umbrellas/shopby/p-2
    http://www.vistastores.com/patio-umbrellas/shopby/dir-desc/order-position
    Now I am looking forward to Google's crawling and how Google treats all the canonical pages. I am quite excited to see changes in organic ranking with the distribution of PageRank across the website. Thanks for your insightful reply.

    | CommercePundit
    0
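For reference, each of the filtered "shopby" URLs listed above would carry a canonical tag pointing back to the base category, along these lines:

```html
<!-- On e.g. /patio-umbrellas/shopby/color-search-blue -->
<link rel="canonical" href="http://www.vistastores.com/patio-umbrellas">
```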

  • Perhaps the OP was doing a site command with www. in the URL as per his message. But yes, the site has definitely been crawled and indexed. If you want to see some actual crawl stats, verify your site with Google Webmaster Tools: http://www.google.com/webmasters/tools/

    | David_ODonnell
    0

  • Your requirements for software are mutually exclusive by definition. White hat, 99% of the time, means a method you have to put some effort into (researching your market, finding partner websites, creating good content, etc.). Using software means automation (no effort). Hence the dissonance. +1 to what Klarke and Egol wrote.

    | Peke
    0

  • Hello Le Nam, Keri asked me to stop by this thread and reply to your question. What you have highlighted in your screenshot is typically known as a 'blended local result'. This means it is a search engine result drawing information both from the business' website and from third-party sources such as Google Maps or Google+ Local. In the past, Google has used a variety of displays for local business information. These days, blended results like the one you have pointed to have become the most common choice of display, either as a single result on a page of organic results or as part of a list of 3-7 other blended results. From your question, I believe you would like to know how a business achieves such a result. The final decision on who gets listed in this very visible manner is up to Google, but a short list of things you can do to work toward achieving this type of listing would be: Have a strong, locally optimized website, meaning you've got your complete business name, address and phone number (NAP) listed in appropriate places on the site (such as the sitewide footer and contact page), your geographic terms (like Pizza Restaurant Ha Noi, or whatever they happen to be) are reflected in your site's titles and tags, and you've got good, strong copy talking about your local goods and/or services. Create and claim a Google Local profile for your business. I looked up zinaki.com and it appears you are in Vietnam (I apologize if I am mistaken about this), and because I could not reproduce the results shown in your screenshot, I am not 100% certain exactly how Google is handling the recent migration from Google Places to Google+ Local in your country. In any case, you need to discover this and get listed in Google's local index, either with a Place Page or a Google+ Local page for your business. Then, you will want to start winning reviews from your customers, through Google and other review sites.
And you will want to get your business listed in as many free local business directories as possible. Make sure your business details (NAP) are consistent across every local directory; these listings are commonly called 'citations'. Finally, you may find it necessary to do some link building to assist with your organic rankings, so that your combined organic and local data will hopefully result in a blended listing like the one you've shown. I sincerely hope this helps, but don't hesitate to let me know if you had something else in mind with your question. Good luck! Miriam

    | MiriamEllis
    0
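One common way to expose consistent NAP details on the site itself is structured data markup; a hypothetical LocalBusiness snippet (every value below is invented for illustration) looks like:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "Example Pizza Restaurant",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "Ha Noi",
    "addressCountry": "VN"
  },
  "telephone": "+84-24-0000-0000"
}
</script>
```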

  • I will do this also. Thank you.

    | snappyuk
    0

  • Samuel, Mat's right - you need to separate the "access denied" pages into two buckets: those which you want indexed and those which you don't want indexed. For those you want indexed, take them on a page-by-page basis to figure out why access is being denied. They might not all be because of password protection. -Dan

    | evolvingSEO
    0

  • Actually, I got the POV = Point of View OK; it was the Miveu = My View I struggled with a little. Like Fransisco, I don't see any problem with the videos - other than that some are not very good. David, I think your product is really cool, and I'd expect that a lot of your customers are going to be posting videos they have made, with all kinds of keywords. If I were going to market your idea, I'd have a blog on your site and embed some of the coolest YouTube videos that your customers make in some of the posts. A good blog that you update with really good stuff will be great for your SEO anyway. This is a product you should be able to have great fun with!

    | David_H
    0

  • Make sure to mark as answered

    | ak1lz
    0