Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • No problem at all. Glad I could help. I think you've got it under control. I tend to over-think things a little bit after a long night, haha. Last two cents... There's a Magento extension I use on one of my stores that's very similar to what you're thinking of. On the checkout page, the very first form requires the user to enter the billing info (just name, email, address, etc... not payment info), as well as a password, and has a checkbox asking whether they would like to "Register for Future Convenience." Above this first form there's a simple link that says "Already Registered? Login here," which replaces the billing info form with a login form when clicked. I think something like that would work perfectly for your situation: you'd just need to add a password field to the billing info section, plus a link that swaps the billing section for a login form. Depending on which form is displayed, your button text would be either "Login and Continue" or "Register and Continue." For new users who need to register, the only additional step compared to a "Guest Checkout" would be filling out a password field. Good luck man. -Anthony

    | Anthony_NorthSEO
    0

  • It's a glitch. Google knows about it. It's been happening on all the sites we maintain for about four days now. I heard this morning that they are aware of it and that it will be fixed soon. C

    | CarlosFernandes
    0

  • Well, when I did it I put in one removal request for the whole domain and also put a disallow in the robots.txt for the whole site. Matt appears to be referring to putting in too many removal requests, but if you want your whole site removed you only need one, so this wouldn't be an issue - you just put your domain URL in. When you say your page has no snippet, have you checked what your meta description is? It can influence your snippet text. I would work on getting your development site removed a.s.a.p. and then see what happens with your snippet - I think there is a good chance it comes down to duplicate content issues. Have you checked the cache for your homepage in Google's results?

    | Matt-Williamson
    0
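A whole-site block like the one described above takes only two lines in robots.txt (this is a generic sketch, not the exact file from the answer):

```
# robots.txt at the root of the development site - blocks all crawlers
User-agent: *
Disallow: /
```

Pair that with a single removal request for the whole domain in Webmaster Tools, as described.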

  • Do you mean that you are copying your competition and putting links to your site on forums and blog sites? If this is the case, then I think the links are of little value and can actually lead to penalties in the SERPs. Please see this response to a similar question regarding these types of links: http://www.seomoz.org/q/blog-comments-good-for-seo Also see this Q&A for sensible link building advice: http://www.seomoz.org/q/drowning-in-a-sea-of-link-building-advice

    | Matt-Williamson
    0

  • I track a lot of websites - not carefully, but I check positions from time to time. Some of them are active clients, and some are sites I worked on before, and I must say this update seems fairly well executed, without many mistakes. In general, around 50% of sites are in the same positions, around 40% had slight improvements, and 10% lost positions. What I noticed - and I'd like to know if you have had the same experience - is that this update seemed to work on something like a per-keyword level. On the 5-6 websites that lost positions it looks strange, because some #1 positions stayed at #1 while others dropped from #1-#5 to #120-#200. One website lost 60% of its visits, and it was my test website: it has 31 pages that all look like "word1 word2 different word", so it is for sure a spammy-looking website. All 30-40 keywords were in the top 20, with around 10 at #1. So it is "strange" that a lot of high-ranking keywords went to #120+ positions while 5-6 keywords are still at #1. Another thing, if I am not wrong, is that Google also focused on websites owned by the same person in the same niche. I have something like a partner with around 10 dental sites. 6-7 of them stayed pretty much the same, 3-4 had slight drops, and one had major drops. What is important is that all the content is unique and not spammy on all of them, and the links were pretty much the same. I mean, he sometimes built links using the same (rubbish) sources, but a big number of links had no connection to each other, so around 70% of links per site were from "unique" sources. But again, in his case the drops were "per keyword": some keywords are still at #1, while the same website lost other keywords from the top 10 to #120+. Did you have the same experience, maybe?

    | m2webs
    0

  • Hi Andy, I've just replied to your PM. Hope it helps!

    | MiriamEllis
    0

  • The best place to get started is by reading up on link building posts here on SEOMoz: http://www.seomoz.org/pages/search_results?q=link+building From there, I'd check out the Link Building Basics section in the Beginner's Guide to SEO. That should keep you busy for quite a while, but once that runs out, check out Point Blank SEO's list of link building strategies. You'll rarely run out of ideas there.

    | KaneJamison
    0

  • Hi Marty, I can say that we have done that. There is a single line of code that you can add to the main website. It detects the browser and opens the mobile version of the website if accessed via mobile. Unfortunately, it does this for iPads too, as it treats them as mobile devices; apart from that, there are no negative implications. I have no idea about #4, but it makes marketing sense to mention your mobile website, if you have one, on your home page.

    | Sangeeta
    0
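The detection described above usually amounts to a user-agent sniff. Here is a minimal sketch (the function name and regex are illustrative, not from any particular plugin) - note that, as the answer warns, a naive pattern catches iPads too:

```javascript
// Rough user-agent check: returns true for devices commonly treated as "mobile".
function isMobileUserAgent(ua) {
  return /Mobile|Android|iPhone|iPad|iPod|BlackBerry/i.test(ua);
}

// In a browser, this would drive the redirect, e.g.:
// if (isMobileUserAgent(navigator.userAgent)) {
//   window.location.replace("https://m.example.com" + window.location.pathname);
// }
```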

  • The best way to prepare for Google's house cleaning updates is to always try to stay as white hat as possible. If you avoid shortcuts and obviously shady techniques, you will most likely BENEFIT from any such updates. Think from Google's perspective: they want only the most legitimate and original content sites, so if that is you, have no fear.

    | TheGrid
    0

  • You might want to elaborate on why you have pages of duplicate content. I'm not sure I understand that point. Normally I'd say let your list be indexed and canonical the products. In fact, I'm not sure a canonical on a list would help all that much. Nofollow links don't pass PR, but they are still spidered and ranked. A robots.txt exclusion would work best for keeping out an entire section of your website. And, yes, Webmaster Tools will complain because you've cordoned off a large section of your site. I seriously hope you have some unique content on your site somewhere, because it sounds like you just de-indexed all your pages!

    | Highland
    0
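As a concrete illustration of the options contrasted above (URLs and paths are hypothetical):

```html
<!-- On a duplicate list page: point at the canonical product page -->
<link rel="canonical" href="https://example.com/products/widget" />

<!-- Or keep a page out of the index while still letting its links be crawled -->
<meta name="robots" content="noindex, follow" />
```

Excluding a whole section via robots.txt would instead be a `Disallow: /section/` line, which is what triggers the Webmaster Tools complaints mentioned.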

  • Sorry, I do not have much. Basically, I believe Google is running all kinds of different tests on each database, collecting data, and then using it in major updates. One possible way to check rankings for different databases is to connect through proxy servers, but I am not a big IT guy, so maybe someone else can tell you how to do that. Istvan

    | Keszi
    0

  • Could very well be - Google has stated that they penalized many sites because they wrongly thought the sites were parked: http://searchengineland.com/dropped-in-rankings-google-mistake-over-parked-domains-118979?utm_source=feedburner&utm_medium=feed&utm_campaign=feed-main

    | AlanMosley
    0

  • You have some valid points to consider... things seem to be improving, and the articles that you might cut do pull in some traffic. I can't tell you how to make your decision, but here is how I made mine. I had hundreds of republished articles, but a lot more that I had written myself. Deleting lots of republished articles would cut my traffic and my income. Noindexing them would cut my traffic and my income. However, although those were serious losses, they were small in comparison to the rest of my site. So, knowing that Google does not like duplicate content, I got rid of them. There is still lots of great content on my site, visitors still find stuff to read, and I know which of the things I cut deserve a customized version authored for my own site. The upside: my site is more compact but still has thousands of content pages, and the content that remains should be a lot stronger. After making the cuts, my rankings, income and traffic increased - not quite to previous levels, but back to nice numbers. I have reduced risk and am pleased with that. Everything that I cut was redirected to similar content. The most valuable of what was cut will be replaced with custom content, with 301 redirects from the old content.
    ============================
    "How likely is this list of 60 articles out of 200 pages to cause a major problem with past or future Panda updates? 17 of 60 are by us, a few are written for us, and several more show up as only us when you type the title into Google surrounded by quotes." What from this is unique? Definitely keep that. Keep what is not struggling in Google. Keep what is essential to your site, but replace it with something better that you create yourself.
    "Do you see further risk in future Panda updates?" Yep... that's why I cut off my foot.
    "My thoughts are to rel=author each of our own articles," YES... In the past I wanted all of my content to be anonymously written. I have changed my mind on that and used rel=author on the best stuff.
    "no-index the duplicates between our 3 sites (we have 3 sites that share a few articles) and no-index the remaining articles." Heh... here I would be chopping off two of those sites and merging them into one. I would have done that years ago, before Panda was ever heard of.
    "I think that the drop in traffic will be outweighed by the lack of risk of current or future ranking drops." I agree.

    | EGOL
    0
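For reference, the 301 redirects described above can be set up in several ways; one common sketch, in an Apache .htaccess file (both paths are hypothetical):

```
# .htaccess: permanently redirect a culled article to the most similar surviving page
Redirect 301 /articles/old-republished-piece /articles/similar-topic
```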

  • This post might also be helpful. http://www.seomoz.org/blog/the-noob-guide-to-online-marketing-with-giant-infographic-11928

    | KeriMorgret
    0