Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Hi Anshul.S, sorry to say, but there is no way you'll ever get the exact same numbers in Salesforce vs. GA. That's because GA tracks Goals based on JavaScript, which can be triggered several times even without a purchase. To give you some examples: a user could refresh the thank-you page and you'll record multiple conversions, or you may record a purchase of $100 but then the person doesn't pay or cancels the order, so it won't be real money. You are correct about integrating GA and Salesforce, but you should use Salesforce to validate GA data, which is mostly interesting for detailed analytical purposes rather than revenue or business ones, where the most reliable source is your own DB. Regarding the source: I did this in the past, and you want to ensure that you create your cookie the same way Google does. Consider that Google changes the source of traffic if the session times out (more than 30 minutes) or if there is a change of source, e.g. you search in Google, enter the website, leave, then enter again via direct. Also ensure you're using the same attribution model Google uses (last click, first click, etc.), and ensure that you're correctly carrying over the source in your cookie by properly QAing it. Hope this helps!

    | mememax
    0

  • Patrick, Thank you very much for taking the time to respond, and I apologize for not responding sooner. This issue is a real head-scratcher, as it turns out. I think what's going on is my client has multiple Gmail accounts with various pieces of key information about her business spread across each one. I think it's going to have to be untangled first: getting info off some of her Gmail addresses and then adding it to her one business Gmail (like her location, for example). Some other advice I'd received when calling Google My Business is that all that stuff is managed there. That's what led me down this path.

    | therealfudgypup
    0

  • Hi Silviu, My apologies for the delayed response. Google is considering your mobile version as the primary version of the website, since it doesn't index the two versions separately. I would suggest getting your website's URL structure reviewed by an SEO professional, along with a detailed SEO audit, as there can be other aspects impacting the SEO of your website. Feel free to respond and ask further questions. Regards, Vijay

    | Vijay-Gaur
    0

  • Hi There, Sub-domains are treated as separate entities now, and they don't add to the SEO value of the main domain. Any links from the subdomain to the main domain are treated as external links. Since those links come from an external domain but the same IP, they may be treated as low-quality backlinks for the main domain; people are divided over this, but the impact can be neutral to negative rather than positive, as search engines may consider it unnatural linking. Sub-folders are treated as part of the domain and pass all the SEO value when linked internally. Here is a response from Rand Fishkin to a similar question: "Subfolders are the way to go, but they're hard to do for a lot of organizations. Many CMS' (like Hubspot) make it quite challenging to host a Wordpress installation on a subfolder, but subdomains are pretty easy. Hence, when choosing where to host a blog or a separate content section, many folks go with the easier route rather than the one that requires a lot of technical effort and webdev/engineering time. However, that doesn't mean that they're not losing out - I'd wager that all of those companies would see a bump if they moved their blogs to a subfolder of the same domain. We see this in example after example when sites invest in it, and you can see plenty of folks discussing their own experiences in the comments of the Moz post you linked to." Source: https://www.quora.com/Which-is-best-for-search-engine-optimization-a-blog-subdomain-or-blogging-at-a-blog/answer/Rand-Fishkin?srid=2Gsa I hope this helps; feel free to respond and ask further questions. Regards, Vijay

    | Vijay-Gaur
    0

  • Fantastic - many thanks Chris!

    | abisti2
    0

  • Hi Kerry, Magento handles this very well by default when you simply enable canonical tags. It canonicalizes all product URLs (no matter how many categories they are in) to the root level. E.g. the following product URLs: www.site.com/category1/product-name/ www.site.com/category2/product-name/ www.site.com/category3/product-name/ would all canonicalize to www.site.com/product-name/. I've worked on many large Magento sites, and this has always worked very well; I've never seen an issue with it. Cheers, David
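    To illustrate, each category-scoped product URL would carry the same tag in its head (a sketch using the hostname from the example above, not actual Magento output):

    ```html
    <!-- Served on /category1/product-name/, /category2/product-name/, etc. -->
    <link rel="canonical" href="https://www.site.com/product-name/" />
    ```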

    | davebuts
    0

  • Ah thank you - I was looking at the Behavior link above all that! Sorted now!

    | abisti2
    0

  • Good question, and one that I don't feel gets addressed enough. Yes, you should always include self-referring canonical tags. There are a few reasons for this, but primarily they help a single version of each URL get indexed. Here's a handful of cases where they're helpful: Some CMSs create URLs that are case-sensitive, i.e. URLs will resolve at /Some-page, /some-page, and /Some-Page. HTTP vs. HTTPS: if you've gone secure, self-referring canonical tags can help search engines learn your new structure and drop HTTP URLs from the index quicker, or at least prevent both secure and non-secure versions from being indexed. Absolute vs. relative links: some development teams prefer to use relative URLs in links when working in Dev and Test environments; this is helpful for preventing unwanted Dev/Test URLs from getting indexed, but isn't ideal for SEO, which is where self-referring canonicals come in. WWW vs. non-WWW: another safeguard to prevent indexing of both versions; even with redirects in place, it doesn't hurt to have a fallback. URLs with parameters: if your site appends parameters to URLs for any reason, self-referring canonicals will prevent indexation of /this-page?q=123. There are probably other reasons to add to this list, but these should be compelling enough to go ahead and add them.
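    As a minimal sketch (example.com is a placeholder), every variant of a page serves the same tag pointing at the one preferred URL:

    ```html
    <!-- In the <head> of /some-page, /Some-page, and /some-page?q=123 alike -->
    <link rel="canonical" href="https://www.example.com/some-page" />
    ```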

    | LoganRay
    0

  • Hi Ray, I checked your website on Google and then looked at its code. It seems the description is being picked up from a DIV on the page. Rename that DIV to something different. Regards, Vijay

    | Vijay-Gaur
    0

  • Hi All, So what should I do to save my category pages from duplicate reviews, given that Google reads both Ajax and iframe content? Thanks!

    | wright335
    0

  • Hi Kelly, A couple of inputs for you: it looks like you are indexing your checkout pages. I would 'noindex, nofollow' the cart pages, as they are just going to dilute your authority through those extra pages. Also, check whether you are following the right URL structure for your pages: https://moz.com/blog/15-seo-best-practices-for-structuring-urls https://moz.com/learn/seo/url I hope this helps. Thanks, Vijay
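    For reference, a minimal sketch of the robots tag for a cart or checkout template (it belongs in the page's head):

    ```html
    <meta name="robots" content="noindex, nofollow">
    ```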

    | Vijay-Gaur
    0

  • Hi Ana, Please check the WordPress optimisation guide here on the Moz blog. It explains in detail the problems with tags and the best practices for using them. Usually, you want to noindex the tags on your WP site: keep them for navigation purposes if you want, but letting them be indexed can lead to duplicate content issues. If you are already getting some traffic on tag pages, you can 301 them to the right pages. I hope this helps. Regards, Vijay
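    If you do redirect a tag page that earns traffic, a sketch of the 301 in Apache .htaccess (both paths are hypothetical):

    ```apache
    # Send an indexed tag archive to the matching content page
    Redirect 301 /tag/blue-widgets/ /blue-widgets/
    ```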

    | Vijay-Gaur
    0

  • I haven't yet, and I'm hoping to solve the bigger issue first, I suppose. Essentially, I built the content on www.domain.com, and the canonical tag is correct. www.UnknownSubdomain.domain.com is what's showing up in rankings. The entire URL structure of the ranking domain makes no sense. We have no record of the subdomain being created, and when I ping the random URL I get our host IP. I'm working on creating the property in Search Console and will try to block it from there.

    | JordanNCU
    1

  • I would use canonical tags in this situation.

    | BCutrer
    0

  • Hi Bernadette, Thanks for your input. I guess my question, put more succinctly, would be: when does "personalisation" cross the line to become "cloaking", and how do you avoid Google confusing the two? By definition, personalisation involves showing one set of content to one set of users, and at least one other set of content to at least one other set of users. I totally understand that Google will essentially only see one set of content, as a "first time" user, but given that that content will not be the same as the content all other users see, I can see that at some point Google might misinterpret this as a malicious technique. Maybe my concern lies in my ignorance of exactly HOW cloaking is carried out technically. Thanks

    | unirmk
    0

  • Hello, Thank you for this information, but I have a follow-up question. The links you sent me refer to images and PDFs, but this isn't relevant to my situation. I need to write in follow/nofollow and rel=canonical via .htaccess because I do not know how to do it for each individual page on my ecommerce store; additionally, .htaccess is easy for me to edit if I ever need to undo something, and it is nice to have everything in one place. Can you give me a formatted example of how follow/nofollow and rel=canonical can be applied to a page via the .htaccess file, please? I intend on doing this for every product category, product, and also my home page on my ecommerce store. Thank you
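    For context on what such an example could look like: .htaccess can't inject meta or link tags into the HTML itself, but Apache's mod_headers can send equivalent HTTP headers. A hedged sketch, assuming mod_headers is enabled and using hypothetical paths and hostnames:

    ```apache
    # HTTP-header equivalents of rel=canonical and the robots meta tag (mod_headers required)
    <Files "old-product.html">
      Header set Link '<https://www.example.com/product-name/>; rel="canonical"'
      Header set X-Robots-Tag "noindex, nofollow"
    </Files>
    ```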

    | moon-boots
    0

  • That's great news, Dusan - glad you figured it out. -Andy

    | Andy.Drinkwater
    0