Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Exactamundo! Here is the high commander Matt Cutts with a run through on how it all works http://www.mattcutts.com/blog/canonical-link-tag/. Cheers, Ross

    | rtavs
    0

  • I haven't dug deep into the content, but I'd look at their link profile and the keyword competition for the specific terms you've used to pull those specific pages.

    | danbocain
    0

  • Hi Daniel, I was thinking about it because very few of the pages were being indexed. Thanks for your suggestion.

    | CommercePundit
    0

  • Thanks, I emailed you.

    | _Z_
    0

  • I have a follow-on question to this: I have a client that's setting up a section of his site in a different language, and we're planning to geo-target those pages to that country. I have suggested a sub-folder solution as it's the most cost-effective option, and it will allow domain authority to flow into those pages. His developer says that, for technical reasons, they can only set this up as a sub-domain, but they're suggesting they can rewrite the URLs to appear as sub-folder pages. I'm wondering how this will work in terms of geo-targeting in Google Webmaster Tools. Do I geo-target the sub-domain or the sub-folder, i.e. does Google only see URLs, or does it physically see those pages on the sub-domain? It seems like it might be a messy solution. Would it be a better idea just to forget about the rewrites and live with the site being a sub-domain? Thanks,

    | Leighm
    0
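
The "sub-domain rewritten to look like a sub-folder" approach the developer describes is usually done with a reverse proxy: the main domain serves the sub-domain's content under a folder path, so search engines only ever see the sub-folder URLs. A minimal sketch in Apache config (all hostnames and paths here are hypothetical, and mod_proxy / mod_proxy_http must be enabled):

```apache
# On example.com: serve content hosted at fr.example.com
# under the /fr/ sub-folder. Google will crawl and index
# example.com/fr/..., so that is the path you would
# geo-target in Google Webmaster Tools.
ProxyPass        /fr/ http://fr.example.com/
ProxyPassReverse /fr/ http://fr.example.com/
```

If the proxying works cleanly, Google sees only the sub-folder URLs it crawls, not where the files physically live.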

  • I have had good results by using the rel="canonical" tag on the satellite pages.  If you have access to the satellite pages then copy the content onto the new domain.  Place the canonical tag pointing to the page on the new domain.  I have also found that this will give the new domain a huge boost in initial rankings.

    | LabadieAuto
    0
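
For anyone unfamiliar with the tag being described, it is a single line placed in the `<head>` of the satellite page. A minimal sketch (both URLs are hypothetical):

```html
<!-- On the satellite page, after copying its content to the
     new domain: point the canonical at the new location -->
<link rel="canonical" href="http://new-domain.com/widgets" />
```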

  • @Ryan Kent, are you suggesting that it is okay for the site to be a complete duplicate as long as it's set up for a specific country (in this case Ireland)? Yes

    | RyanKent
    0

  • I got it... I am going to implement it as the previous one. Thanks for your prompt reply.

    | CommercePundit
    0

  • Hi John, thank you for your reply. It's kind of what I was thinking - if it feels grey or iffy... it probably is! The reasoning was that it is a custom system with a list of car manufacturers, with outbound links to their websites. That's 34 external links on every page. The client didn't develop the system, so I'm trying to work with it and optimise as best I can without them incurring development costs. Thanks again for your reply. Trevor

    | TrevorJones
    0

  • You can, but then the content will be removed from Google's index for 90 days. I am not sure what effect this would have on pages with the images. It shouldn't have any effect, but I would hate for you to have rankings in any way affected for 90 days. I have no experience in having images indexed in this manner. Perhaps someone else has more knowledge to share on this topic.

    | RyanKent
    0

  • Hey Techboy - Assuming your site validates fine using the Rich Snippets tool,  as you said, unfortunately I don't think there is much you can do. I heard Stefan Weitz from Bing talk about it, and he said that it's a slow rollout because they want to get it right, and they are also wanting to see how people use it. Also, the search engines are giving priority to brands and well-known people (especially with rel=author markup), so the little guys are having a harder and harder time getting the semantic markup to show in the SERPs. He even went so far as to say that we should mark up our sites now, so that when Schema is rolled out more we'll be ahead of the curve (and he insinuated that it will affect rankings positively as well). Sorry I can't provide an actionable answer, but right now with the semantic markup it can be a bit of a waiting game.

    | dohertyjf
    0
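
For those wanting to mark up now and be "ahead of the curve" as described above, a minimal schema.org microdata sketch looks like this (the product name and numbers are made up for illustration):

```html
<!-- Illustrative schema.org Product markup in microdata form -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Acme Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5 based on
    <span itemprop="reviewCount">23</span> reviews
  </div>
</div>
```

Validating this in the Rich Snippets tool confirms the markup is readable, even if the engines choose not to display it yet.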

  • Since .com domains are assumed to be targeting the United States by default, you don't need to do anything to target American searchers beyond having a site that is in English and is linked to from other English websites. There are a few reasons you'd ban visitors based on IP/location, but none of them have to do with improving rankings. These include: you have a different site for international locales (you would want a 301 redirect, not an IP ban); you want to prevent forum/blog spam (Cloudflare would be a much better alternative than doing this manually); or legal reasons why people from a specific location shouldn't be accessing your site or page - the only legitimate reason I can think of to ban large blocks of IPs.

    | KyleJohnson40
    1
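
To illustrate the "301 redirect, not an IP ban" point for international locales, here is a hypothetical Apache sketch using the mod_geoip module (which must be installed separately; the domains are made up):

```apache
# Send visitors geolocated to the UK to the UK site with a 301,
# rather than blocking their IP ranges.
GeoIPEnable On
RewriteEngine On
RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^GB$
RewriteRule ^(.*)$ http://example.co.uk/$1 [R=301,L]
```

The redirect preserves the visit and passes link equity to the locale-specific site, which an IP ban simply throws away.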

  • I have observed some strange switches between the home page and a landing page on our site, which kind of goes towards your theory. However, I am not convinced that Google consolidates PageRank in any other way than simply by following links and mathematically assigning it throughout the site. All duplicate pages flow PageRank and pass it to the rest of the site in the same way they receive it. Sometimes Google will block a page which is seen as a duplicate. That page will not show or pass any PageRank or anchor text value to other pages (a dead end).

    | Dan-Petrovic
    0

  • I have seen one interesting solution for this in the retail/ecommerce industry. Knowing that the iPhone will evolve through different versions, webmasters name their pages domain.com/iphone (not domain.com/iphone4). The version is contained in changeable elements such as the title tag, meta description and content. Another thing you can do is merge various pages of similar or identical content into one canonical version (using rel="canonical"). When pages expire it is a good idea to take users to the next best piece of content - to maximise SEO value you would redirect them using a 301 if the change is permanent, or a 302 if you anticipate the job coming back at some point in time. By recycling the same URL when a job re-appears you prevent the creation of disposable URLs and run a cleaner site. Any old links pointing to those pages will not go to a 404 page but to the old URL, thus capturing the much-needed link juice in the correct place.

    | Dan-Petrovic
    0
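
The 301-versus-302 choice above can be expressed in a couple of .htaccess lines. A sketch with hypothetical job-listing paths:

```apache
# A permanently removed listing: 301 to the next best content
# (here, the jobs category page) to pass link equity.
Redirect 301 /jobs/retired-role /jobs/

# A seasonal listing expected to return: 302 to a holding page,
# keeping the original URL alive for when the job re-appears.
Redirect 302 /jobs/summer-intern /jobs/coming-back-soon
```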

  • Hi Micelleh, Ryan has given you a great answer here. There is also a really detailed post about getting into Google News from Cyrus Shepard on the SEOmoz Blog. Sha

    | ShaMenz
    0

  • The issue is the repetition of words more than anything. There's no justification or rationalization that can be used to say "this long URL is valid from a readability or a page topical focus perspective." In fact, it can both make the site look untrustworthy to some users and potentially cause search engines to flag the page as over-optimized - going too far with keyword repetition is definitely something that can cause a page to lose some of its ranking value.

    | AlanBleiweiss
    0