Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I hear ya! I find one of the biggest issues ecommerce sites have with rankings has to do with their data. That and a lack of canonicalization to help Google figure out which are the money pages.

    | DonnaDuncan
    0

  • To add to this, Google have a good primer: https://support.google.com/webmasters/answer/35291?hl=en. There is some good info there, e.g. "No one can guarantee a #1 ranking on Google".

    | Alex-Harford
    0

  • If you are going to move it, do it soon. Also compile a list of places where you have already gotten the links and message the site owners to ask them to update those links. The longer you wait, the more of a pain this will be. Also, is your other site as sound for on-page optimization as the current one? If there are issues with title tags, H1s, or any number of other items, then it may not be as successful on the site where you want it to be.

    | TheeDigital
    0

  • Good advice, thank you. Luckily I know some of those other things better than SEO, so that works out well. Cheers, Wes

    | wrconard
    0

  • The guys all have some great thoughts here. One more thought: if a promotion or product is usually short-lived but comes back intermittently (e.g. once a year, twice a year, etc.), the URL can be kept live. A good example is rental real estate in a big city - the properties will come back on the market, but the URLs need to show visitors that the apartment is not available at the moment. Rightmove (real estate in the UK) does this with rentals it knows will come back on the market, e.g. http://www.rightmove.co.uk/property-to-rent/property-21907128.html They re-use the pages when the property is live again - I've seen them do it with both flats I lived in, and they rank remarkably well. Questionable usability if a page ranks and is actually unavailable, but effective. Clearly this is only relevant if you will re-use / re-open these short-term promotions in the future.

    | JaneCopland
    0

  • The SEO of the site is probably fine.  The problem with the site is that it takes one page of content and smears it across dozens of thin content, duplicate content, cookie cutter pages.  The SEO is lipstick on a pig.

    | EGOL
    0

  • That would point the most authority at the home page, but isn't the best user experience. Our concern is that Google monitors how people use 301s in terms of how it affects user experience and whether the redirection was done purely for SEO purposes. However, if you can do that without confusion, it would be a good option.

    | JaneCopland
    0

  • Since the onclick event is a JavaScript matter, it's to all intents and purposes ignored by the search engines. What counts is whether or not your link appears in the href attribute, and whether or not the link is nofollowed. If you do not have it set to nofollow, there is a slight chance they would follow it, since, as stated above, Google is pretty good at parsing JavaScript links.

    | David-Kley
    0
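The point above can be sketched in code: a plain HTML parse surfaces the href and rel attributes, while the onclick handler never enters the link graph. A minimal sketch using Python's stdlib parser (the URLs are made up for illustration):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects (href, nofollow) pairs the way a simple crawler might."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        nofollow = "nofollow" in (attrs.get("rel") or "")
        if href:  # an onclick-only "link" has no href, so it never lands here
            self.links.append((href, nofollow))

html = (
    '<a href="/shop" onclick="track(\'click\')">Shop</a>'
    '<a href="/affiliate" rel="nofollow" onclick="go()">Deal</a>'
)
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # href and rel survive; the onclick handlers do not
```

Both anchors are collected because both carry an href; only the second is flagged nofollow, and the onclick attributes play no part in the result.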

  • A 401 is 'unauthorized' - is that the code it would produce, or a different error (or a typo!)? That could work in theory - I'd be a bit hesitant about the extra step involved in 301ing to get to an error page on a different site. In general, the fewer steps you make Google go through, the better. This method would mean that your new site should not be "credited" with the bad links, however.

    | JaneCopland
    0

  • Hi Aaron, Glad you sorted out a fix for now - I couldn't find a bot name either (and people who are simply thieving aren't likely to obey robots.txt either). Let us know if we can help more. Cheers, Jane

    | JaneCopland
    0

  • First of all, I'd like a little more information about the "legal reasons" that are forcing you to create ccTLDs for Canada and the UK. What you're doing now is actually ideal for international SEO when it comes to the same language, so I wouldn't want to change things unless it's really necessary.

    Is it because the EU has stricter laws regarding privacy? In that case, I'd recommend following the strictest laws for the entire .com site and leaving it at that. Is it because your company has to offer different legal information from country to country? If that's true, I'd post all legal information on the site, for all countries, and let visitors look at the pieces that are most relevant. Is it because you're starting to sell different products? In that case, yes, you probably need a new ccTLD since you technically have different sites, but that means you can't really use hreflang, since your product pages will be different.

    If it's something I didn't list, and you absolutely have to create new ccTLDs, the best recommendation is to use hreflang. You can either use it in the <head> of your page, like you said, or in the XML sitemap so the extra code doesn't have to be loaded with each page. Hreflang works a little like canonicals, so Google can choose to pay attention to the international versions of the site or ignore them, but eventually, yes, your UK and Canadian sites should rank as well as your .com site. There will definitely be a dip in traffic as Google figures things out, though.

    Hope this helps, and good luck!

    | KristinaKledzik
    0
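A rough sketch of what those hreflang annotations look like, with made-up URLs for three English variants (real tags must use valid language/region codes, and every variant has to carry the full reciprocal set, including a reference to itself):

```python
# Hypothetical English-variant URLs, purely for illustration.
variants = {
    "en-us": "https://www.example.com/page",
    "en-gb": "https://www.example.co.uk/page",
    "en-ca": "https://www.example.ca/page",
}

def hreflang_tags(variants):
    """Build the <link rel="alternate"> annotations each variant's <head>
    (or the XML sitemap entry) would carry."""
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(variants.items())
    ]

for tag in hreflang_tags(variants):
    print(tag)
```

Each of the three pages would carry all three tags, which is what lets Google treat them as alternates rather than duplicates.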

  • I would recommend option #1. It's common for sites without SSL certs not to resolve properly at the HTTPS version of their URLs, and Google handles this fine. You could pull the log files and take a look at how often Googlebot / other users request HTTPS versions of that site A's URLs, to determine if that SSL/redirect set up is necessary. But I would not anticipate any significant negative impact on traffic letting the HTTPS version of site A kick a 404 or server error.

    | MikeTek
    0

  • I am actually working with those URL parameters right now. What I can tell you at this point is that you have to name the parameters exactly: if you want to exclude the attribute "pgid", "CategoryName", etc., then that is exactly what you have to enter. You can't use "+", "-", "/", etc., because Google ignores them. An example: if somebody is running an ecommerce store on Intershop, the system generates attributes like ";pgid", and Google can't work with ";", so you can't exclude that attribute using GWMT. As far as I know, Google can only handle parameters introduced by "?". So, whatever comes after your "+", take the next word and try to exclude those URLs with that. You have to be patient: it will take some time to see the effect (if it works!)

    | dotfly
    0
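The ";pgid" problem described above shows up in standard URL parsing too: everything after "?" parses as query parameters, while a ";pgid=..." piece is treated as part of the path, which is consistent with a parameter tool keyed on query strings being unable to target it. A small sketch with a made-up Intershop-style URL, using Python's stdlib:

```python
from urllib.parse import urlparse, parse_qs

# Made-up URL in the Intershop style described above.
url = "http://shop.example.com/catalog;pgid=abc123?CategoryName=shoes&page=2"

parts = urlparse(url)
print(parts.params)           # 'pgid=abc123' - a path parameter, not a query parameter
print(parse_qs(parts.query))  # {'CategoryName': ['shoes'], 'page': ['2']}
```

Only "CategoryName" and "page" come back as query parameters; "pgid" lives in the path-parameter slot, outside the query string entirely.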

  • Just wanted to add, no I don't think there would be any adverse effects on ranking, unless the site was compromised somehow. Since you are on a completely different system, you should be fine with a resubmission. On a side note, since you are on WordPress now, make sure you have the right file permissions, and shell access turned off. Hackers love a WordPress site that is unprepared. Best of luck with the new site!

    | David-Kley
    0

  • If you stick the meta tag <meta name="robots" content="noarchive"> in the <head> of a page, Google and Bing won't show cached versions of that page on search results pages. It shouldn't have any impact on SEO. It only means that archived versions of the page won't be provided by the search engines on search result pages. As far as reasons to do this: we use it on pages of our site where the cached page would be blank, because the page contents are loaded by AJAX, and the cached versions wouldn't load that content.

    | john4math
    0
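A quick way to check whether a page carries the noarchive directive described above is to look for it in the robots meta tag. This is a minimal sketch, not a robust parser: it assumes the name attribute appears before content, which real pages don't guarantee.

```python
import re

# Matches <meta name="robots" content="..."> and captures the content value.
ROBOTS_META_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
    re.IGNORECASE,
)

def opts_out_of_cache(html):
    """True if the page's robots meta tag includes the noarchive directive."""
    match = ROBOTS_META_RE.search(html)
    if not match:
        return False
    directives = {d.strip().lower() for d in match.group(1).split(",")}
    return "noarchive" in directives

page = '<head><meta name="robots" content="noindex, noarchive"></head>'
print(opts_out_of_cache(page))  # True
```

Directives are comma-separated inside content, so the check splits and normalizes them before testing for "noarchive".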

  • Personally, I'd leave them in place. I don't think the breadcrumbs do any harm. Now, if they were creating additional URLs, I would say you may have an issue, but I don't think you have anything to worry about. If anything, they could provide a benefit.

    | David-Kley
    0

  • Hi Dan, Hopefully your read-through of Google's guidelines will have helped to clear this up, too.

    | MiriamEllis
    0