Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Theoretically, if a URL is blocked by robots.txt it should not appear in the index results regardless of whether it is in the sitemap, but I have seen URLs get indexed that are blocked by robots.txt yet sit in the sitemap and have good links pointing to them. If you want to block pages that have good links pointing to them, my advice is to remove them from the sitemap. #justathought. As for URLs from multiple domains, I personally create separate sitemaps for the different subdomains and link them from the main sitemap (roughly as sketched below), and I see better indexing that way. Again, these are my personal experiences and not rules, so please keep in mind that things can be different for you.
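    For illustration, a minimal sketch of that sitemap-index structure, using hypothetical hostnames and file names (note that search engines generally require cross-host sitemap references to be verified, e.g. in Search Console):

      <?xml version="1.0" encoding="UTF-8"?>
      <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <sitemap><loc>https://www.example.com/sitemap-www.xml</loc></sitemap>
        <sitemap><loc>https://blog.example.com/sitemap-blog.xml</loc></sitemap>
      </sitemapindex>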

    | MoosaHemani
    0

  • Thanks Tom, yes that's very helpful

    | abisti2
    0

  • CleverPhd, Really nice to see a detailed yet to-the-point answer. Thanks for contributing and for being part of the Moz community. Regards, Vijay

    | Vijay-Gaur
    1

  • Thanks Michael for the response. Excuse my poor description; by indexing I was referring to my ever-growing list of crawl errors. When you say "It looks like not only do you need to resolve any MANAdev issues but you need to do an audit on the site as I think you have several issues," are there some obvious errors you can see? Thanks

    | tidybooks
    0

  • You can't get their domain validated in your Google Search Console account (unless they add you as a user to their account); however, there is a workaround, more or less described here: Getting CDN images indexed with Google. In short: if your image is, for example, on https://12345.maxcdn.com/images/directory/subdirectory/image-filename.jpg, you can add a CNAME (alternate domain name) to make the URL formatted like this: http://images.yourdomain.com/images/directory/subdirectory/image-filename.jpg. After that you can add "images.yourdomain.com" as a verified domain in Google Search Console (see the DNS sketch below) and treat it as a new subdomain, incl. sitemaps etc. Hope this helps
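    A minimal sketch of that DNS record in BIND zone-file syntax, using the hypothetical hostnames from the example above:

      ; Point the vanity image hostname at the CDN hostname
      images.yourdomain.com.  IN  CNAME  12345.maxcdn.com.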

    | ConclusionDigital
    0

  • Hi Amit, Thanks for your answer. That's what I thought and I implemented this. Hope this will do. Cheers, Chendon

    | RBijsterveld
    0

  • 1. Adding this doesn't change the way a transaction is tracked; instead, it makes it so that you can view the 'event' that fired under Event Tracking in GA. If you look there (Behavior > Events), can you see the successful event and the source/medium that drove it? (A sketch of the kind of event hit I mean is below.) 2. Have you also tried the previously mentioned segmentation under User Explorer? None of these fixes are going to change the way the data appears in the acquisition report (switching to PayPal Payments Pro usually does the trick, but is more expensive); all we're really looking to do here is find another way to attribute the conversions to their source. I'd recommend revisiting my most recent reply on how to segment the users under User Explorer, as this will likely give you the best insights. Let me know if you run into any issues there and I'll help you out.
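    For reference, a rough sketch of the kind of event hit being discussed, using the standard analytics.js ga() call; the category, action, and label values are hypothetical:

      // Fire an event when the buyer lands back on the PayPal return page,
      // so the conversion shows up under Behavior > Events in GA.
      ga('send', 'event', 'Ecommerce', 'paypal-return', 'standard-checkout', 1);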

    | TrentonGreener
    0

  • Hi, Thanks for your reply. I have Yoast SEO installed and can see the option to implement a canonical tag. It asks me to input a Canonical URL, with the description being "The canonical link is shown on the archive page for this term." Using the kangaroo example above, how will setting that Canonical URL help to provide me with separate page title elements for pages 1 and 2? At the moment the URL says www.blahblah.com/portfolio/kangaroos for page 1 and www.blahblah.com/portfolio/kangaroos/2 for page 2, but they both have the same page title of Kangaroo Portfolio | Domain. So I'm unsure what exactly I need to do in Yoast to differentiate the page title of page 2 from page 1 to remove the duplicate page title error I'm getting. Thanks.
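    P.S. Speculating here: Yoast's title templates seem to support a %%page%% replacement variable that renders the page number on paginated archives, so perhaps a hypothetical template along these lines is what I'm after?

      Kangaroo Portfolio %%page%% | Domain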

    | onefoursix
    0

  • First of all, a quick question: how is a new domain going to influence user experience? Couldn't you redesign the website and keep the current domain? Domains build up a track record with search engines, and historical data is factored in. So yes, you will see a significant impact on your organic traffic; anytime there is a brand-new domain in the picture, you're essentially starting at ground zero. 301 redirects will pass traffic and authority eventually, and per Gary Illyes they no longer lose PageRank. BUT, that doesn't mean you should add a bunch of redirects without a really good reason (if you do move, a sketch of a site-wide 301 is below). I strongly recommend you reconsider switching domain names and simply redesign the domain you've already got.
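    If a move does go ahead anyway, a minimal sketch of a site-wide 301 in Apache .htaccess, assuming mod_rewrite is available and using placeholder domain names:

      RewriteEngine On
      # Send every request on the old domain to the same path on the new one
      RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
      RewriteRule ^(.*)$ https://www.new-domain.com/$1 [R=301,L]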

    | LoganRay
    0

  • I don't understand the question. Perhaps submit a new question to create a new thread, and elaborate on the issue.

    | Everett
    0

  • Ali, If you are on WordPress you can try some of these. First, set up the free Yoast SEO plugin if you have not already. I have been working on this exact issue: the way tags are set up (on the client site), we are getting duplicate pages. I am trying a couple of things. In Yoast XML Sitemaps > Taxonomies you can set "Tags" to "Not in Sitemap"; removing them from the XML sitemap will at least lessen the frequency with which Google accesses tagged pages. In Yoast Titles & Meta > Taxonomies you can select Index or Noindex for your tags. Following the tag idea, you might get a 404 if a tag is changed or deleted. It's hard to diagnose 404s without seeing them. Consider a custom 404 error page, if you have not already; that will at least improve the user experience. With .htaccess you can set different redirects and custom 404 pages depending on the site section (a sketch is below). Tags in WP don't seem to add much benefit, and probably cause more problems (like dupe content); even if you are diligent with tag consistency, most blogs end up with nearly duplicate tags. There are many Moz posts on the topic of duplicate content in WP specific to tags that might further illuminate your question. Did you change anything universally, like switching to https? Missed redirects can cause a 404. The WP Broken Links checker in Tools will direct you straight to the 404 pages. Here are some more tips for dealing with 404s in WP: http://www.wpbeginner.com/showcase/6-best-free-404-plugins-for-wordpress/ The WP plugin "Redirection" is well regarded, but use caution with lots of ecomm URL variables. Hope this helps!
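    A minimal .htaccess sketch of the custom-404-plus-redirect idea, with hypothetical paths and tag slugs:

      # Serve a custom 404 page site-wide
      ErrorDocument 404 /custom-404.html
      # Redirect a renamed or deleted tag archive to its replacement
      Redirect 301 /tag/old-tag/ /tag/new-tag/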

    | TwoOctobers
    0

  • If you don't have access to the logs, that could be an issue - there aren't really any automated tools out there, as a tool would need to crawl every linking website to find the 404 errors. I haven't tried this - so it's just an idea. Go into GSC and download all the links pointing to your site (and do the same from places like Moz, Ahrefs, and Majestic), then chuck that list of URLs into Screaming Frog or URL Profiler, look at the external links, and see if any are returning a 404 (a script sketch of the same idea is below). Not sure if this would work - it's just an idea. Thanks Andy
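    A rough sketch of that check as a script, assuming Node 18+ (built-in fetch), an ES-module setup, and a urls.txt exported from those tools; the file name is hypothetical:

      // check-404s.ts - flag exported URLs that now return a 404
      import { readFileSync } from "node:fs";

      const urls = readFileSync("urls.txt", "utf8").split("\n").filter(Boolean);

      for (const url of urls) {
        try {
          const res = await fetch(url, { method: "HEAD", redirect: "manual" });
          if (res.status === 404) console.log(`404: ${url}`);
        } catch {
          console.log(`unreachable: ${url}`);
        }
      }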

    | Andy-Halliday
    0

  • You can tell Google what URL parameters to ignore in Google Search Console. It's under Crawl > URL Parameters > Configure URL Parameters. Google does advise using caution when changing how Googlebot handles the parameters.
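    A related (but different) technique, for illustration: Googlebot supports wildcards in robots.txt, so a pattern like this hypothetical one can keep a parameter out of the crawl as well:

      User-agent: Googlebot
      # Block any URL carrying a sessionid query parameter (hypothetical parameter)
      Disallow: /*sessionid=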

    | Brando16
    0

  • Hi Praveen, In addition to this, my answer to a similar question might help you gauge the quality of the links: https://moz.com/community/q/how-do-you-handle-a-site-with-inherited-negative-links-but-no-penalty#reply_350787 Do check spam links manually; tools can sometimes generate false spam reports based on their own metrics, and you might lose a valuable link that is bringing real target traffic to your website. Feel free to respond and ask any further questions. Regards, Vijay

    | Vijay-Gaur
    0

  • Hi, Although I agree with Andy Drinkwater, I would still go ahead and disavow these links in time. It's the right thing to do; you never know if the next Google update starts looking at these spammy links as well. Always do the right thing first - don't wait to react to the situation when it arrives. (For reference, the disavow file format is sketched below.) I hope this helps; feel free to respond if you have further questions. Best Regards, Vijay
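    For reference, the disavow file Google expects is a plain text list, one entry per line; the domains and URL here are placeholders:

      # Spammy directories identified in the backlink audit
      domain:spammy-directory.example
      domain:link-farm.example
      # A single bad page rather than a whole domain
      http://bad-links.example/widgets/page1.html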

    | Vijay-Gaur
    1

  • Hi Paul, The difference is made by how you plan to use these domains: if they are going to inter-link with each other, they may be seen as a link-wheel strategy (a black-hat strategy). If they are competing for the same industry space (as in keywords) or working to benefit the same company together, then again that may be a problem. As for the same IP, there are several websites hosted on the same server IP at any given time. I hope this helps, please respond if you have further questions. Regards, Vijay

    | Vijay-Gaur
    0

  • Further to the above, I think I've solved that issue. There was an error on that product you used as your example: it was loading an attribute as a variation that didn't exist. I've removed it now.

    | SushiUK
    0

  • Hi Jeff! From personal experience, we've seen our company's and clients' videos rank better with the YouTube URL than through their URL on our own websites. Like Linda said, YouTube has a very high domain authority and because of this is more likely to rank at the top of organic listings than if the video was hosted on your site. Check out this Moz article (and video!) for some more information: https://moz.com/blog/video-seo-post-rich-snippets Let me know if you have any other questions!

    | BlueCorona
    0

  • This quote is from Moz's domain setup guide: "Since search engines keep different metrics for domains than they do subdomains, it is recommended that webmasters place link-worthy content like blogs in subfolders rather than subdomains. (i.e. www.example.com/blog/ rather than blog.example.com) The notable exceptions to this are language-specific websites. (i.e., en.example.com for the English version of the website)." I think that quote is pretty compelling toward the idea of subdirectories, and the value of subfolders over subdomains certainly makes sense, especially from a linking, age, and link-juice perspective. Ultimately, it comes down to you and what you want for the website.

    | BlueCorona
    0

  • No, there is no way to submit a list of URLs to Google Search Console that you want marked as fixed. Instead, what you can do is filter on these URLs in the interface and then mark them all as fixed at the same time. Google Search Console doesn't allow you to update URLs through the API.

    | Martijn_Scheijbeler
    0