Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Killer question. Once I had a coder do a quick mod to show a fixed/static block of content with all the posts under it, and it worked. I didn't carry that project for long and didn't try other alternatives - would love to know if there are better ways to go about it too!

    | Syed1
    1

  • I can't remember where I saw the data, but the benefits of a link drop dramatically after the first few. So spending all that time building links to a secondary blog that will only give you good juice on maybe two links seems a waste of time. You would be better off buying a well-established blog, dropping a couple of links to yourself, and continuing to use it for its intended purpose. If it were me, I would put that quality content with the products themselves and work on getting them ranked. With a well-designed page you should be able to convert those people for the phrases they searched. If additional help is needed, link out to it from this page rather than from a separate blog. It will benefit from the very relevant link on your own site, and you can continue to build links to it if it's a popular search phrase.

    | ResslerMotors
    0

  • See, the problem is that it's not really a subcategory; it's a filter that displays filtered products from the given parent "category". All categories in our system have these filters. There is no place where we can edit the data, since it's not a real category. If it had been, we would of course have made UNIQUE descriptions.

    | areygie
    0
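For filter URLs like these, one common approach is to keep them crawlable but out of the index so the duplicated descriptions never compete with the parent category. A minimal sketch, assuming the filter pages should not rank on their own (the `noindex,follow` choice is one option among several):

```html
<!-- Emitted in the <head> of every filtered-category URL -->
<meta name="robots" content="noindex,follow" />
```

Alternatively, a canonical tag on each filter URL pointing back to the parent category page tells Google which version to index without hiding the filter pages from crawlers.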

  • From Chris Silversmith on Twitter: Google Places tries to normalize the addresses it displays. Suite #s are often not necessary to find locations, so they often leave them out. https://twitter.com/#!/si1very/status/183276965158060033

    | JoeYoungblood
    0

  • There has not been a PR update in the past few days (as far as I know), so it is doubtful that the PR your browser returns is being steered by Google; something else is affecting your stats. Your site is on a shared host. If something happens to any of the other sites on that server, your site would/could see an immediate effect. I would move away from shared hosting and go for a dedicated server. Good luck!

    | Discountvc
    0

  • Hi Hanna, I have a quick question. Should the language function be added to every URL or just the home page? Thanks, Guido.

    | SilbertAd
    0

  • There appears to be a fundamental issue with the way your search results link to your products. Is this store based on CubeCart? I had a similar issue a while ago with CC, and I managed to fix it by changing the code on the search results page to build the full path to the product rather than the shorter URL without the category folders. Another option would be to ensure that products carry a canonical tag pointing to the actual page URL. Check out the guide below to canonical tags. http://www.seomoz.org/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps That said, I would recommend fixing the issue at the root by modifying the way the search links to products.

    | Aran_Smithson
    0
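The canonical tag mentioned in the post is a single line in the product page's `<head>`. A minimal sketch, with a made-up example.com URL standing in for the real full product path:

```html
<!-- Served on every URL variant of this product, short or long -->
<link rel="canonical" href="http://www.example.com/category/sub-category/product-name.html" />
```

Each duplicate URL then consolidates its signals onto the one full-path version, though fixing the search-results links themselves remains the cleaner root-cause repair.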

  • Dan, sorry for the delay; just wanted to say thanks for your insight into the website, and yes, I prefer your honesty and feedback. shivun

    | seohive-222720
    0

  • Hello Alan, If they are only doing this for Google then it is indeed cloaking, regardless of their intent. That may or may not land them in hot water, but I think Google has provided plenty of other ways to handle this situation. First and foremost, it would be best if those session IDs and certain other parameters were put into a cookie instead of the URL. As you probably know, you can tell Google and Bing how to handle the different parameters with their free webmaster tools, so that would seem to me like the best approach if you cannot get rid of the parameters altogether. You can also put a rel canonical tag in the header that references the version of the URL without the parameters. Force redirecting Googlebot is not a good idea. If, for instance, you wanted to force redirect by IP - as is the case with many global sites that have geo-specific landing pages - then that would be fine as long as you aren't making any special exceptions for Googlebot. Did this answer your question? Good luck! Everett

    | Everett
    0
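For the session-ID case, the rel canonical option is one tag on the parameterized page. A sketch under the assumption that the parameter-free URL is the preferred version (both URLs here are hypothetical):

```html
<!-- Served on http://www.example.com/page?sessionid=abc123 -->
<link rel="canonical" href="http://www.example.com/page" />
```

Unlike a forced redirect, every visitor and every bot sees the same page, so there is no cloaking concern.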

  • Hello Ocelot, I am assuming you have a site that has affiliate links and you want to keep Google from crawling those affiliate links. If I am wrong, please let me know. Going forward with that assumption then... That is one way to do it. So perhaps you first send all of those links through a redirect via a folder called /out/ or /links/ or whatever, and you have blocked that folder in the robots.txt file. Correct? If so, this is how many affiliate sites handle the situation. I would not rely on rel nofollow alone, though I would use that in addition to the robots.txt block. There are many other ways to handle this. For instance, you could make all affiliate links javascript links instead of href links. Then you could put the javascript into a folder called /js/ or something like that, and block that in the robots.txt file. This works less and less now that Google Preview Bot seems to be ignoring the disallow statement in those situations. You could make it all the same URL with a unique identifier of some sort that tells your database where to redirect the click. For example: www.yoursite.com/outlink/mylink#123 or www.yoursite.com/mylink?link-id=123 In which case you could then block /mylink in the robots.txt file and tell Google to ignore the link-id parameter via Webmaster Tools. As you can see, there is more than one way to skin this cat. The problem is always going to be doing it without looking like you're trying to "fool" Google - because they WILL catch up with any tactic like that eventually. Good luck! Everett

    | Everett
    0
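The robots.txt block described in that answer is only a couple of lines. A minimal sketch, assuming the affiliate redirects live under a folder named /out/ (the folder name is illustrative):

```text
# robots.txt at the site root
User-agent: *
Disallow: /out/
```

Keep in mind a Disallow only discourages crawling; blocked URLs can still appear in the index without content, which is one reason to combine it with rel nofollow on the links themselves.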

  • You can find out by using Google's PageSpeed tool online here: https://developers.google.com/pagespeed/

    | SEOExecutive20
    0

  • As far as I can tell, those are just plain Google Images in a universal search result. If you search [model warship combat hulls] in Google, you'll see a similar result. I know most of those image results and can tell you they are certainly not in Google Merchant. In fact, the first result image (of just a white hull with a pink interior) is from my site, and we're not in Google Merchant. We have made a point of using descriptive file names, alt text, etc., but I know that many of the results for similar queries haven't optimized even that much.

    | KeriMorgret
    0

  • Hey Sebastian - We already do something similar to know if a job has expired (instead of the if condition in MySQL, we query for records where job_closing_date >= CURDATE()). Thankfully they programmed that in to pull the old job off the list and out of the job search results. (Though up until yesterday the old jobs were still in the XML sitemap... whoops. Guess what I fixed yesterday!) I like your idea, though, of keeping the content active and keeping the page alive, but with some kind of message above it. That would definitely keep the page unique. I'm not positive that will fly on the business side, but I'll definitely propose it. Thanks for the reply!

    | Matthew_Edgar
    1
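The expiry filter described in that post can be sketched end to end. This is a minimal illustration using Python's sqlite3 (where MySQL's CURDATE() becomes date('now')); the jobs table and its column names are assumptions for the example:

```python
import sqlite3
from datetime import date, timedelta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jobs (job_title TEXT, job_closing_date TEXT)")

today = date.today()
conn.executemany(
    "INSERT INTO jobs VALUES (?, ?)",
    [
        ("Open role", (today + timedelta(days=7)).isoformat()),     # still open
        ("Expired role", (today - timedelta(days=7)).isoformat()),  # past closing date
    ],
)

# Equivalent of: SELECT ... WHERE job_closing_date >= CURDATE() in MySQL.
# ISO-formatted date strings compare correctly as text.
open_jobs = [
    row[0]
    for row in conn.execute(
        "SELECT job_title FROM jobs WHERE job_closing_date >= date('now')"
    )
]
print(open_jobs)  # only the still-open job
```

The same WHERE clause can drive both the public job list and the XML sitemap generator, so expired postings drop out of both in one place.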

  • Here are some simple first steps that will at the very least fix the underlying issues flagged by Google. 1. Contact the past SEO company directly, explain the warnings from Google, and see if they will give you that info. You never know, and it never hurts to try; many SEO companies would like to keep their reputation even after being dismissed by a client. 2. What if that doesn't work? Contact the owner of each site and explain that you have been hired to address SEO warnings from Google caused by a prior SEO company, and that you would like to resolve them by asking the owner to delete the articles. This is actually a win/win, because I don't know any website owner who really wants garbage on their site. 3. I don't see why there would not be an opportunity to increase their rankings again. I would be upfront with them and explain that it's pretty much like starting from scratch and that past rankings are irrelevant. Send them the release about the JCPenney incident so they can familiarize themselves with the situation they have encountered. http://www.pcmag.com/article2/0,2817,2380306,00.asp 4. Once you've solved these issues, do what you do best and optimize their off-page SEO, since you said their on-page is pretty good. You said they want results yesterday. Don't give in to their pressure by making any false promises; just be straightforward. It's better to lose a client with peace of mind than to make them false promises that will end up haunting you later.

    | SEOExecutive20
    0