Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Hi ozil! Did Eric's answer help? We'd love an update.

    | MattRoney
    0

  • Hey Janet, Like Jordan says, putting a bit of content around each link to provide some background info about each service would be good, rather than just a list of links to other companies. About the follow/nofollow issue: as long as the sites aren't spammy, there shouldn't be a need to nofollow them. I've heard it reinforced several times recently that as long as you trust the website you're linking to, there's no need to nofollow the link. Adding a nofollow is like saying to search engines, 'I don't trust this guy' - but if you didn't trust them, why would you link to them? Anyway, sorry for the rambling. Where the links are duplicated, nofollow them; otherwise there's no need.
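    For reference, the nofollow is just a rel attribute on the anchor tag. A quick sketch (hypothetical URL):

    ```html
    <!-- Normal, followed link: an editorial endorsement of the target -->
    <a href="https://example-partner.com/">Example Partner Services</a>

    <!-- Nofollowed link: asks search engines not to count it as an endorsement -->
    <a href="https://example-partner.com/" rel="nofollow">Example Partner Services</a>
    ```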

    | MrLeeB
    1

  • Thanks Rhonda! It sounds like this thread will benefit many people then.

    | Everett
    0

  • Ben, I'm assuming that you have 301 redirects properly set up for all of those pages. Then, make sure you've used the Google Change of Address Tool to tell Google that you've moved from one domain to another. If you are consolidating domains, you need to verify all of those sites in Google Search Console and then use the Change of Address Tool. You don't mention how long it's been since you set up those 301 redirects, but it can literally take months before everything is straightened out in Google. One other thing you can do is look at the site's log files to see whether Google is crawling those pages--it could be that they're not crawling. If they have crawled, then it might just take time before the pages are indexed.
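    As a sketch (assuming an Apache server and hypothetical domain names), a domain-wide 301 that preserves paths would look like this in the old domain's .htaccess:

    ```apache
    # .htaccess on old-example.com (hypothetical names)
    RewriteEngine On
    # Match requests for the old host, with or without www
    RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
    # Send each URL to the same path on the new domain with a permanent redirect
    RewriteRule ^(.*)$ https://www.new-example.com/$1 [R=301,L]
    ```

    The path-preserving part matters: redirecting every old URL to the new homepage loses the page-to-page mapping.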

    | GlobeRunner
    0

  • Have you got switchboard tags implemented? That should solve the issue.
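    For anyone unfamiliar: switchboard tags are the rel="alternate"/rel="canonical" annotations Google recommends for sites with separate mobile URLs. A minimal sketch with hypothetical URLs:

    ```html
    <!-- On the desktop page (https://www.example.com/page) -->
    <link rel="alternate" media="only screen and (max-width: 640px)"
          href="https://m.example.com/page">

    <!-- On the corresponding mobile page (https://m.example.com/page) -->
    <link rel="canonical" href="https://www.example.com/page">
    ```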

    | bridget.randolph
    0

  • Hi Taiger, You can find instructions on how to do this at https://www.google.com/support/webmasters/answer/1235687 The menu itself is found at https://www.google.com/webmasters/tools/crawl-url-parameters You'll just want to enter escaped_fragment when it asks for the parameter name, and answer "No" to the question about whether this parameter changes content on the page.

    | KaneJamison
    0

  • Thanks Eric, My title tags are already super-short (sometimes less than 40-45 characters and far less than 487 pixels) but are still truncated by Google as I showed you in my example...

    | trigaudias
    1

  • For those wondering, I haven't been able to find a solution, and the people I have spoken to agree that this isn't really possible (at least not enough to warrant the amount of work it would require).

    | ThomasHarvey
    0

  • Finally we were able to get a 200 response. Thank you for your input.

    | ang
    0

  • If the pages are already indexed and you want them to be completely removed, you need to allow the crawlers in robots.txt and noindex the individual pages. If you just block the site with robots.txt (and I recommend blocking via folders or variables, not individual pages) while the pages are indexed, they will continue to appear in search results, but with a meta description along the lines of "a description for this result is not available because of this site's robots.txt". The pages will continue to rank and appear because of the cached data. If you add noindex tags to your pages instead, the next time crawlers visit the pages they will see the new tag and remove the page from the search index (meaning it won't show up at all). However, make sure your robots.txt isn't blocking the crawlers from seeing this updated code.
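    To illustrate, the tag crawlers need to see on each page is just this - and it only works if robots.txt still allows those URLs to be crawled:

    ```html
    <!-- In the <head> of each page you want removed from the index.
         If robots.txt blocks the URL, crawlers never fetch the page,
         never see this tag, and the stale listing lingers. -->
    <meta name="robots" content="noindex">
    ```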

    | OlegKorneitchouk
    0

  • Thanks for the nudge Matt, We're in the process of working on the redirects now...and checking GA results based on Matt-Williamson's feedback. ~Caro

    | Caro-O
    1

  • If you look at those links in Google Search Console (crawl errors), you'll see that there is a date there. If some show up with an older date (older than yesterday, for example), you can mark them as fixed. If those errors are still there, they'll show up again.

    | GlobeRunner
    0

  • Personally, I'd say to make 100% sure all your 301 redirects are placed properly, and that each old URL is redirected to the absolute best new URL to get the user what they're looking for.

    | MattRoney
    0

  • I agree with Eric on the maximum number of URLs you can add in one sitemap. Then you can add another sitemap and connect them together with a sitemap index. My suggestion is to create separate sitemaps for the blog, static content, main pages, and ads. In my opinion, it's OK to keep the expired ads' URLs in the XML sitemaps. Just a thought!
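    To illustrate the "connect together" part: the sitemaps protocol caps each file at 50,000 URLs, and a sitemap index ties the separate files together. A sketch with hypothetical file names:

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap><loc>https://www.example.com/sitemap-blog.xml</loc></sitemap>
      <sitemap><loc>https://www.example.com/sitemap-static.xml</loc></sitemap>
      <sitemap><loc>https://www.example.com/sitemap-main.xml</loc></sitemap>
      <sitemap><loc>https://www.example.com/sitemap-ads.xml</loc></sitemap>
    </sitemapindex>
    ```

    You then submit just the index file in Search Console and it picks up the child sitemaps.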

    | MoosaHemani
    0

  • During these events we show them the "behind the scenes" of our company as well as the manufacturing process and give them an amazing experience surrounded by our products. I believe that qualifies as promotional--you are giving them a perk that others don't normally get. So, those links should probably be tagged as nofollow links. The post or article that is written should include an explanation that the writer was invited to the event--which would be clear when reading what they posted. Keep in mind that nofollow links aren't always a bad thing--it's natural for your site to have both links with nofollow on them and links without. It's ultimately up to the individual blogger or author/site to decide, though, and I wouldn't obsess over these links.

    | GlobeRunner
    0

  • No problem! I meant to mention this in my first comment, but I also noticed that there's no robots.txt file in place. That's obviously not going to help your indexation problem too much, but nonetheless something you should know about.

    | LoganRay
    0

  • Shopify support is telling me this isn't possible. Do you know otherwise, perchance? Cheers, Dan

    | Dan-Lawrence
    0

  • This should also lend some perspective as to why SEOs typically recommend at least 250 unique words per page. I think with Google paying more attention to main content as separate from supplemental content, code uniqueness may be less relevant than it has been in the past. But overall, you should still strive to ensure your pages have a clear, distinct purpose/value conveyed by the main content of the page that is easily distinguishable from other pages of your site.

    | HiveDigitalInc
    0

  • Like others have mentioned, if your site has both the "healthcare solutions" and "healthcare identity solutions" categories (category and subcategory), then it would make sense to keep it that way. Users should be able to navigate to each of those pages in the URL. If you are just referring to the URL, though, keywords in the URL are such a small part of the search engine algorithms that they really won't make a huge difference--and I hardly think you'll be penalized for it.

    | GlobeRunner
    0