Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Do you see links reported in GWMT for both versions? In our case there are no links reported for the non-www site. (www is canonical). If you see links on both sites, I would disavow both just to be safe.

    | Chris661
    0
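    For anyone unfamiliar with the mechanics: a disavow file is a plain-text list uploaded through Google's disavow links tool. The entries below are placeholders, not real domains - a minimal sketch of the documented format:

    ```text
    # Disavow file (.txt, UTF-8) uploaded via Google's disavow links tool
    # Lines beginning with # are comments and are ignored
    domain:spammy-directory.example
    http://blog.example.net/low-quality-page.html
    ```

    `domain:` disavows every link from that domain; a bare URL disavows only that page.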

  • What I mean by one-to-one redirects is redirecting each page on one domain (the ones with incoming traffic or links) to the equivalent page on the other domain. They are standard redirects, e.g. fullcompanyname.com/category/product1 to compname.com/cat/product1. Hope that helps.

    | BeanstalkIM
    0
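    The one-to-one mapping described above can be sketched in an Apache .htaccess file on the old domain. This assumes an Apache server and permanent (301) redirects; the domains and paths are the placeholder examples from the comment:

    ```apache
    # .htaccess on fullcompanyname.com
    # Redirect each old URL permanently to its equivalent on the new domain
    Redirect 301 /category/product1 http://compname.com/cat/product1
    Redirect 301 /category/product2 http://compname.com/cat/product2
    ```

    One `Redirect 301` line per page keeps the mapping explicit; for large sites a `RewriteMap` or pattern-based `RewriteRule` may be more practical.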

  • Thanks for the thanks, Patrick G. An amusing sidelight: one company that refused my request for company email had previously entrusted me with use of their corporate credit card -- and continued to do so after refusing my request. Go figure. (sigh)

    | DanielFreedman
    0

  • Just looking at the anchor text in your backlink profile alone gives clues to why it's hard to rank - http://moz.com/researchtools/ose/anchors?page=1&site=http%3A%2F%2Fwww.mesocare.org%2F - the anchors are mainly commercial keywords that one would try to rank for. That is exactly the sort of link building that no longer works as well, and in extreme cases it will even get you penalized. You have links on pages such as this: http://blogs.creighton.edu/klb89788/2012/08/31/hello-world/ and this: http://wsn.eecs.berkeley.edu/?p=56 - which are not going to help and will likely only hurt the site in rankings. Unfortunately this will take quite a bit of effort and work to overcome. You'll really have to show Google some more quality signals over time (6-12 months at least) - not only links, but user metrics etc. as well.

    | evolvingSEO
    0

  • Thanks!  I downloaded the paid version of JoomSEF... the free version includes a link in the footer... kinda tacky for a client's website.  The paid version was only $35.57 USD.  So... crossing fingers.. here goes nothing!

    | Laurean
    0

  • Hi Joe, There haven't been any major image algorithm updates since the beginning of August that I'm aware of. Have you looked at the queries you rank for in image search to see where specifically you're dropping? There could be another site that popped up in early August that's outranking you. Best, Kristina

    | KristinaKledzik
    0

  • Hi Ray, Thanks for responding. I don't know my programming too well, but from the way you describe it, this seems like a standard process. I will send this to my developers - thanks for your help!

    | WSteven
    0

  • Completely agree with every point Tom made here.

    | Bryan_Loconto
    0

  • Hi there, The good news is that you're not the only one! According to this post from Woorank in the Google Product Forums, their links are apparently nofollowed and you can contact them to have them manually removed: "All of our links have nofollow attributes, so it is odd that you would get so many showing in your Google Webmaster Tools. We'll certainly be checking into this further. In the meanwhile, please send an email to support(at)woorank(dot)com with your website address and a request to remove the link and we will be happy to comply." (Source) It looks like other Mozzers have had this issue too. Apparently Woorank do have a process where you can get the links removed, so I would try that first. I also have a suspicion that, since so many webmasters are reporting a large number of links from Woorank, Google is hopefully ignoring them. John Mu confirmed that they ignore UpDowner, so I wouldn't be surprised to hear they ignore Woorank too. Hope this helps.

    | ecommercebc
    0

  • Hello Bruce, It was about promoting our ecommerce services - and all the different areas of that really... Thanks William

    | wseabrook
    0

  • You may find these links helpful: http://moz.com/blog/10-things-relaunch-your-website http://www.seoandy.net/optimisation/relaunch-seo/ http://searchenginewatch.com/article/2070968/SEO-Website-Redesign-Relaunching-Without-Losing-Sleep Thanks

    | sachin-sv
    0

  • Thanks Sean, I will contact the help team and see if they can be of assistance.

    | chrissmithps
    0

  • They'll be easily visible to a crawler. They'll be extracted based on their HTML/CSS elements, which are additional signals of importance and of website structure. Don't worry about it. If the HTML of the page includes the raw text, a crawler will understand it.

    | alecfwilson
    0

  • How long ago did you add the redirects? As with many SEO changes, you'll need to give Google time to recrawl and reindex your site. This can take several weeks in some cases, and even then, it will sometimes take multiple recrawls for Google to entirely index the changes. This assumes everything else was done correctly and nothing else is causing the duplicate content warnings.

    | alecfwilson
    0

  • Great question, and great answers from some of the other commenters. I've struggled with this question myself in building landing pages. The 20% rule is a good one, and makes sense, especially as Google gets better at semantic search and "keywords" become a bit less important in favor of query meaning. In a perfect world (one where search engines could understand queries the way a friend would when you told them what you searched for), if you cannot come up with 20% of a landing page that is entirely unique to that page, it's not something you should be building a landing page for. In the world we operate in, it's a nice guideline.

    My method for long tail landing page creation is: figure out which head keyword the long tail landing page is most related to (if you are trying to reuse the same value prop), and just rewrite every sentence. You should alter your word choice, sentence structure, and page organization (it's a nice opportunity to test those things as well; a long tail page that does unexpectedly well may give you some insight into a better converting format). At this point, I add the unique content. For keywords that aren't different enough to have truly unique content, I'll generally write a section summarizing a few of the others together, or add a different customer testimonial.

    To the commenter who mentioned that you can create content that is unique to search engines but that humans would laugh at - a landing page for long tail keywords really shouldn't be something a customer can get to without coming from an external referrer. The root domain shouldn't link out to both domain.com/landing-page-head-kw and domain.com/landing-page-long-tail-kw.

    | alecfwilson
    0

  • Is this still happening? I crawled the URL you listed with a tool an old friend built that extracts HTML/CSS elements and reads the page like a search engine would (he's worked on search engines professionally for years), and it extracted the correct title. As referenced above, Google's structured data tester is also showing no errors (http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.freestyleaccounting.com%2Faccountancy-services%2Fsetting-up-a-limited-company-for-contracting-quick-start-guide%2F). A screenshot of the query and result would be really helpful.

    | alecfwilson
    0

  • Hi, I haven't actually done this myself, but the best way to handle it is: after the move, once the www version of the site has been verified in GWT, just check to see if the disavow file exists there. If it doesn't, just re-add it. The fact that you are redirecting from non-www to www means that your disavow document will still be fine to use, as it applies to the new site too. -Andy

    | Andy.Drinkwater
    0
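    For reference, the non-www to www redirect discussed above is commonly done with mod_rewrite in Apache. This is a sketch with example.com as a placeholder, assuming an Apache server with mod_rewrite enabled:

    ```apache
    # Force the www version: 301-redirect example.com/* to www.example.com/*
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
    ```

    The `[R=301]` flag makes the redirect permanent, which is what signals search engines to consolidate the two versions.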

  • Just fyi and for anyone who might be interested: That was the solution! I put the rel=canonical tag in the homepage header and my duplicate page content problem was gone! Thanks!!!

    | momof4
    0
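    For anyone applying the same fix: the tag mentioned above goes in the page's `<head>`. A minimal sketch with a placeholder URL standing in for the preferred homepage address:

    ```html
    <head>
      <!-- Tells search engines which URL is the preferred (canonical)
           version of this page, consolidating duplicate-content signals -->
      <link rel="canonical" href="http://www.example.com/" />
    </head>
    ```

    Use the full absolute URL of the version you want indexed, and keep it consistent with your redirects (e.g. always the www version).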

  • Thank you everyone for your responses! The link you sent of the cached pages, LynnP, was also helpful. As soon as my co-worker who administers the server gets in, I'm going to suggest we check the subfolders for anything fishy. I know for a fact he looked for suspicious subfolders, but I'm not sure he thought to check the existing folders for sneaky things. Most passwords have been changed... but I will double check. Again, thanks everyone for your help, very useful!

    | mshowells
    0

  • Yes, that's generally considered the correct way of doing it if you have links to both www and non-www.

    | Chris.Menke
    0