Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I know the feeling and concerns, Marshall.  Glad I found the question here!

    | AlanBleiweiss
    0

  • In an ideal situation, the method where you code the variable directly into the canonical tag string can work; however, it can also lead to problems if the coding framework is not properly planned out, or if an upgrade to the system has a bug in it.  I've seen situations where page_uri stops functioning after a system upgrade. Best practice dictates that the full, non-appended absolute URI shows up at the source level on a live page, so testing is critical if you're going with that option, and be aware of potential unanticipated breakdowns. It's all about what appears in view-source, or what Googlebot sees.

    | AlanBleiweiss
    0
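
The failure mode described above can be sketched like this (the PHP template variable and the example.com URL are hypothetical illustrations, not taken from the original post):

```html
<!-- Fragile: the canonical URL depends on a framework variable
     (hypothetical) that can silently break after a system upgrade -->
<link rel="canonical" href="<?php echo $page_uri; ?>" />

<!-- What view-source on the live page should actually show:
     a full, absolute URL with no appended parameters -->
<link rel="canonical" href="http://www.example.com/category/product-page/" />
```

Checking the rendered output, rather than the template, is what catches the breakage.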

  • I am more of a layperson, and joined this site to research certain topics. I wrote a blog post on the topic of RSS feed scraping and content theft, which, if I understand correctly, was not exactly the OP's question. Having read the other answers here, however, I am actually wondering if my blog post is inaccurate, or incomplete, and needs correction. Are you all saying that there is no harm in your RSS feed being scraped, and that it might actually be helpful due to the backlinks you might get? Or are you saying that Google ignores those links because it is clear they come from an RSS feed? Or am I misunderstanding your point entirely? Thanks for clarifying. Here is the post if anyone wants to scan it and respond: http://info.icopyright.com/discovery-copyright-infringement-detection/blog-content-theft-protect-your-blog-from-rss-feed-scrapers Thanks. PS: If I have misunderstood the protocol, and should not piggyback on someone else's topic or add a link, please advise. This is my first foray into this forum.

    | rhondahurwitz
    0

  • Hi Jason, Thanks for the definition. I recommend that you use Mike Blumenthal's Local Business Center Category Tool (http://blumenthals.com/index.php?Google_LBC_Categories), and you will start to see why you are going to have difficulty marketing this business in a traditional local way. The term 'pregnancy care center' is associated with the following synonyms: abortion, clinics, ob gyn, obstetrics. So, if the business chooses to market itself through typical local channels (like Google Places, Yahoo Local, Bing Places for Business, Yelp, etc.), there is a very high chance that your core term, pregnancy center, is going to be mistakenly associated with abortion clinics, which is clearly not what your client wants. I'm not sure this is avoidable, however the client chooses to market itself. Even if the website is 100% clear about which services are and are not offered at the center, people often fail to read websites carefully, and the center may end up fielding calls for services it doesn't offer. Of course, this is a risk factor for literally any type of business model; an auto body shop may get calls for muffler repair, even if it doesn't offer it, right? If you are the marketer for the project, I believe your task will be to explain the risk/benefit scenario to the client. There is going to be a built-in grey area for the company. Some people may see the business on- or offline and think it is an abortion clinic, a right-wing organization, a birthing center, etc. I don't see a way around this, and it is likely something the center is already dealing with. So, basically, they need to make an informed decision about the amount of publicity they wish to earn, as this will likely be commensurate with the amount of traffic, phone calls, etc., they receive for services they don't offer, while at the same time bringing them new business from the right types of clients.

    | MiriamEllis
    0

  • In his case, he wants to get rid of some duplicate content only. I see what you mean, but if he is not in the situation listed in http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1269119 then it might be the best and fastest bet. For me personally it has worked very well so far, provided no robots.txt block is used, as that won't help in the long run because the removal tool has an expiration date of several months. The downside of the removal tool is that same expiration date: if you change your mind, you will have some issues getting the pages back into the index.

    | eyepaq
    0

  • Hi, If it's 3 to 5 pages that you want to create, that's fine, as you can manually craft those to be somewhat different. But if you go overboard, it will be seen as duplicate or spammy content and you just won't rank. If you really go overboard and deploy a gazillion city pages, you might even run into site-wide issues... Cheers.

    | eyepaq
    0

  • Google can and will index pages, even after nofollow links are added. It might be that a different domain is linking to that page, rendering the nofollow useless. The Yoast plugin adds a noindex. The noindex is only applied after recrawling, plus some days; you can check this in Google by looking at the cached copy and its date. Even so, Google still can and sometimes will index it (e.g., if you apply noindex to your whole domain, Google will still hold your homepage in the index for a long time). Google makes a decision based on its own parameters.

    | Stramark
    1

  • Thanks.... I was starting to wonder whether I needed a space or something... I have this without a space: with a space: It shouldn't make much difference, but as it's taking time for Google to notice, I'm just curious...

    | bjs2010
    1

  • You have a meta tag in your header, "noindex, nofollow", which tells search engines not to index that home page and not to follow the links on it.

    | AlanBleiweiss
    1
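
For reference, the tag being described looks like this in the page head:

```html
<!-- Tells search engines not to index this page and not to
     follow the links it contains -->
<meta name="robots" content="noindex, nofollow" />

<!-- Removing the tag, or using the default below, allows
     indexing again once the page is recrawled -->
<meta name="robots" content="index, follow" />
```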

  • Hi Michael, It is as much about domain, nameservers etc. as it is about developing localised content and also building backlinking properties from the country in question. Try and think of all the things you would do if you were setting up a local London business i.e. .co.uk domain, London directory listings, content relevant for the audience etc. Hope this gives you a good starting point.

    | MichaelYork
    0

  • Honestly, most people just ignore it. But I'll leave the discussion open for a while in case someone has other solutions to the issue and has found them worth pursuing.

    | Everett
    0

  • Hey Simon, Australia is lang="en-au", the UK is lang="en-gb", and the US is lang="en-us". We've tried to keep these as tight per country as possible, so we opted not to use the plain 'en'. In analytics, there has been some reduction in language referrals, mainly "en-gb" falling from the number one language type for the US site, which is a positive. Interestingly enough, once we removed the .co.nz from the index, the .com site moved in to dominate the SERPs for brand and some core keyword searches in Google.co.nz. It's a little unfortunate, as Panda, from my understanding, is keen to spare ccTLDs from any harsh devaluations, but hopefully we'll be able to hit whatever the threshold is for percentage of unique content in the near future. We have review functionality planned for each TLD, which should help add value to the existing duplicate content. Once this is up and I have some more robust data, I'll pull a post together for YouMoz. Thanks for the feedback! Kian

    | team_tic
    1
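
The per-country values mentioned above would be declared in the markup of each site. The domains below are placeholders, and the hreflang annotations are an assumption about how the sites might cross-reference each other, not something stated in the post:

```html
<!-- Each ccTLD declares its regional variant on the html element,
     e.g. <html lang="en-au"> on the Australian site. -->

<!-- If the sites also cross-reference one another, hreflang
     annotations in the head would look like this: -->
<link rel="alternate" hreflang="en-au" href="http://www.example.com.au/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/" />
```

Using the region-specific codes (en-au, en-gb, en-us) rather than plain en is what keeps the targeting "tight per country," as described above.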

  • Just to clarify, let me be more specific about what I initially planned to do. Let's say my current website is at example.com. Rel canonical is set so that if a user lands on www.example.com, they're redirected to the non-www version. For some reason, the negative SEO targeted www.example.com instead of the main URL. So I'm thinking that, since all the bad links are pointing to the www version, redirecting that somewhere else may help. What you say makes sense, that Google can detect that both versions are from the same domain. I didn't see any major drops from Penguin last night. The main drop started in the middle of March and continued through April. This seems to have been an algorithmic penalty rather than a manual penalty, as I didn't receive a warning. The entire homepage wouldn't redirect, just the www domain. I doubt anyone would be affected by this, since the rel canonical is set to the non-www version. I think I agree with Chris on this and will just wait it out and see if the disavow request goes anywhere.

    | howardd
    0
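
For context, consolidating the www and non-www versions is usually done at the server level rather than with rel canonical alone. A minimal sketch, assuming an Apache server with mod_rewrite enabled and example.com as a placeholder domain:

```apache
# 301-redirect every www request to the non-www hostname,
# so both versions consolidate onto one canonical domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```

A permanent (301) redirect passes the consolidation signal, which is also why pointing the www version "somewhere else" would not isolate it from the main domain.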

  • Thanks for the input, Mat and Michael. Unfortunately it is not a question of necessity or vanity now... too late! We have 301 redirects in place so, fingers crossed, we will come through OK.

    | franchisesolutions
    0

  • Press releases are meant to get picked up by lots of media outlets.  That's the point.  For example, if you submit to PR Web, your press release will end up on hundreds of sites.  I know from experience that it does not hurt you, as I do this daily for my clients and my own sites/companies.  It helps as long as the releases are well optimized and have high-quality content that people will want to read.  Google et al. will select the "best" one(s) to keep in the SERPs and the rest will drift away.  Even though the pages with the press release still exist, they have no impact, positively or negatively, and you won't get a duplicate content penalty, as the search engines know how to treat news.  I agree with Chris that you must understand where press releases fit into your overall strategy.  They are only one small piece of the process.  If you build a strategy relying on press releases, you are going to be losing that treadmill battle.

    | RepLoc_Tim
    0

  • Yes I can see some changes in Google in Australia, some strange results though!

    | Karen_Dauncey
    0

  • I haven't seen any changes in rankings related to Penguin 2.0.

    | maestrosonrisas
    0