Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • That is actually a good thought, but Kolea 10A is a little below average in traffic compared to the others.

    | RobDalton
    0

  • Hey Cyndee, your issue has to do with how this is coded. Let me explain. Here's what your paginated numbers at the bottom look like in the code:

    <a title="2" data-bvcfg="3520493" name="BV_TrackingTag_Review_Display_PageNumber_2" data-bvjsref="http://improvements.ugc.bazaarvoice.com/0048-en_us/414441/reviews.djs?format=embeddedhtml&amp;page=2&amp;scrollToTop=true" <strong="">href="javascript://">2</a>

    Notice that the "href" attribute of the anchor tag has no direct URL, and because of that Google doesn't crawl to the next page in the series - there's no actual link. Ideally the href would contain the actual URL of the second page, so that it is accessible to Google. Granted, Google will likely come back to these pages with its more feature-rich crawler and be able to access the content, but that could take a long time or in fact never happen. I believe this is a function of how BazaarVoice operates, although I haven't had enough experience with it to know for sure. A view-all page would help you get around the problem, but again, I'm not sure how that works with regard to BazaarVoice. You can also use rel="prev" and rel="next" to connect the pages, but that directive often has spotty results. -Mike
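
    To illustrate the fix Mike describes, here is a sketch of what a crawlable version of that pagination could look like. The URLs below are hypothetical placeholders, not BazaarVoice's actual markup:

    ```html
    <!-- In the <head> of page 2 of the review series (hypothetical URLs): -->
    <link rel="prev" href="http://www.example.com/product/reviews?page=1">
    <link rel="next" href="http://www.example.com/product/reviews?page=3">

    <!-- In the pagination controls: a real URL in href, so Googlebot can
         follow it without executing JavaScript. A script can still intercept
         the click to load the reviews in place for users. -->
    <a href="http://www.example.com/product/reviews?page=2" title="2">2</a>
    ```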

    | iPullRank
    0

  • Part of me thinks this is just because so many live pages still have links to these URLs, and while Google knows they 301, some part of its index maintains the old URLs due to the strength of their presence in the link graph. It might be a technical limitation too, where Google uses synthetic links/URLs in their indices to represent redirected stuff. In any case, not a big concern for Moz, and shouldn't be for you either if you're redirecting a site. So long as the traffic to the pages from search goes back up and indexation looks solid, you should be fine.

    | randfish
    0

  • Are you sure that you have the right property selected in Google Search Console? There is a difference between www and non-www, and between http and https.

    | Stramark
    0

  • This was just put out: https://www.seroundtable.com/google-sitemap-index-count-drop-is-a-bug-20608.html

    | EcommerceSite
    0

  • Hi Darcy! My name's David Black and I'm the Director of Customer Success here at SEMrush. The "SEO Ideas" (or, as I like to call it, "Optimization Ideas") tool contained within your Projects is currently in beta, so you're one of only a select few who have access to it, as well as to some other beta tools. It's not that we want to police this stuff - we love having beta testers! However, if you have any questions or suggestions about tools, especially those in beta, I would definitely encourage you to contact us directly at mail@semrush.com. Our Support team is killer and our Product Owners are foaming at the mouth for any feedback you can provide. We look forward to hearing from you!

    | DavidBlack
    0

  • Thanks for the help.  We figured it out!

    | SOLVISTA
    0

  • It seems that the sudden drop in indexed pages reported in WMT might relate to a reporting issue on Google's end - https://productforums.google.com/forum/#!topic/webmasters/qkvudy6VqnM;context-place=topicsearchin/webmasters/sitemap|sort:date

    | propertyshark
    0

  • Ah sorry, I misunderstood Louis - skim reading too quickly! No, don't bother if it is a nofollow link as this is essentially what Google does to a dofollow one. Apologies for the confusion. -Andy

    | Andy.Drinkwater
    0

  • WOW, this is an interesting thread. In theory, rel next/prev is what Google wants you to go with. In practice, however, I haven't seen it work as advertised by Google, and I end up going with rel canonical in most cases.

    Here's one way to think about it: Allow non-filtered pagination for top-level categories to be followed, but NOT indexed. Give them their own self-referencing rel canonicals and ensure the intro content (or any other static content on the page) only shows on the first page (which should rel canonical to / instead of /?page=1). This will ensure your product pages all have a path going to them. Use rel next/prev here, which "may" help the main/first page rank better by consolidating ranking signals from the paginated set. For sub-categories and/or filters/facets, use rel canonical pointing to the canonical version of that category page.

    None of this, however, takes care of the crawl budget issue on enterprise eCommerce sites with crawlable filtered URLs. Therefore, I also use the robots.txt file, or nofollow attributes on links, to handle this. I don't often use the nofollow robots meta tag, for a variety of reasons.

    Again, in the real world rel next/prev doesn't seem to be working very well. I haven't tried it in a while, but it just doesn't seem to do what Google says it's supposed to do. This is why you see a lot of sites using rel canonical instead.

    I think we should think about this in terms of where you are in the architecture instead of trying to fit all pages into a single scheme. For example, some sites may even want to make the first facet/filter indexable - such as "Filter by Brand" - because people are searching for it. If that's the case, I'll often advise they turn that into a real category instead of a filter. Another example: if you sell kitchenware and Pots & Pans is a category but Material is a filter, you're missing out on a lot of searches for things like "Copper Pots & Pans" or "Stainless Steel Pans".

    It all depends on the situation, and has to be evaluated on a case-by-case basis.
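
    Everett's scheme for a non-filtered paginated category page could be sketched like this (a rough illustration only - the site paths and URLs are hypothetical):

    ```html
    <!-- Hypothetical <head> for page 2 of a /widgets/ category -->
    <meta name="robots" content="noindex, follow">  <!-- followed, NOT indexed -->
    <link rel="canonical" href="http://www.example.com/widgets/?page=2">  <!-- self-referencing -->
    <link rel="prev" href="http://www.example.com/widgets/">  <!-- page 1 lives at /widgets/, not /widgets/?page=1 -->
    <link rel="next" href="http://www.example.com/widgets/?page=3">

    <!-- By contrast, a filtered/faceted URL under that category points its
         canonical back at the canonical category page: -->
    <link rel="canonical" href="http://www.example.com/widgets/">
    ```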

    | Everett
    1

  • Hello aquaspressovending, A good person for you to follow would be Aleyda Solis. To answer your question: it is relatively safe to repurpose content across different pages, subdomains, or even domains if done properly for different countries. You can choose a canonical version (e.g. the US version or the UK version) and connect the rest using rel alternate hreflang tags, which are explained in the checklist Patrick linked to above.
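
    For reference, the rel alternate hreflang connection mentioned above could look like the sketch below (the domains are hypothetical). Each version of the page lists every version, including itself:

    ```html
    <!-- In the <head> of both the US and UK versions: -->
    <link rel="alternate" hreflang="en-us" href="http://www.example.com/" />
    <link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
    <!-- Optional fallback for users who match neither language/region: -->
    <link rel="alternate" hreflang="x-default" href="http://www.example.com/" />
    ```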

    | Everett
    0

  • No problem, it's actually really easy: https://www.google.com/webmasters/tools/googlebot-fetch Once you have selected your account, add the URL and then submit it to the index. I would do the homepage first and, for that page, use the "Crawl this URL and its direct links" option. Then for the subpages use the "Crawl only this URL" option. It can also help to use "Crawl this URL and its direct links" on any of your top-level menu items to help speed things up.

    "For example, i just checked a page and saw that some images weren't being indexed." Does your robots file allow specific access to those pages? If not, here is how you can set it to do so. This will also allow Google's partners to access your images. Add this to the bottom of your robots file:

    User-agent: Googlebot-Image
    Allow: /images/

    User-agent: Adsbot-Google
    Allow: /

    User-agent: Googlebot-Mobile
    Allow: /

    User-agent: Mediapartners-Google*
    Allow: /

    Sitemap: http://www.YOURSITEHERE.com/sitemap.xml

    | David-Kley
    0

  • You could always disallow crawling of the tag pages in robots.txt and then submit all the URLs you want to appear through Webmaster Tools using the "Fetch as Google" option. Another thing I would do is include only the URLs you want indexed and focused on in the sitemap, and exclude the rest. This should cut back on the strange or redundant URLs getting indexed and showing up. Something like:

    Disallow: /tag/

    (Note that robots.txt paths are relative to the domain root, so use /tag/ rather than the full URL.) Hope this helps!

    | David-Kley
    0

  • Comment out both. The RewriteCond is the test condition that, if satisfied, causes the RewriteRule on the following line to be executed.
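
    As a hypothetical illustration of the pairing (the domain is a placeholder, not the asker's actual rule), commenting out both lines of a RewriteCond/RewriteRule pair in .htaccess looks like this:

    ```apache
    # A RewriteCond applies only to the RewriteRule immediately after it,
    # so the pair is disabled (or re-enabled) together:
    # RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    # RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
    ```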

    | MichaelC-15022
    0

  • Hi Aron, There are 3 solutions for making parallax websites SEO friendly. Read about them here: http://moz.com/blog/parallax-scrolling-websites-and-seo-a-collection-of-solutions-and-examples

    | Carla_Dawson
    0

  • Thanks Patrick. Appreciate the encouragement. I will provide a service that is above board and ethical. I know that there is a lot of interpretation on things, and I would rather take the high road than get stuck in the mud over a few successful, short-term leads. I am going to get back to the "other" guys, challenge them on their practice, and try to encourage them to change. Doesn't matter what their response is - these are just my actions. Thanks again. Really appreciate it. Gary

    | gdavey
    0

  • Your misspelled domains should 301 to the correct domain. You can make any subdomains redirect to the proper TLD as well.
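
    In Apache, for example, a 301 from a misspelled domain (and any subdomain of it) to the correct domain could be sketched like this in the misspelled domain's .htaccess - the domain names here are made-up placeholders:

    ```apache
    RewriteEngine On
    # Send exmaple.com, www.exmaple.com, etc. to www.example.com, keeping the path:
    RewriteCond %{HTTP_HOST} ^(.+\.)?exmaple\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
    ```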

    | Highland
    0

  • Hello Tanveer. That's a fine format for breadcrumb linking and can help users navigate your site more easily. Here's an article that discusses what you're asking: https://moz.com/ugc/guide-to-ecommerce-facets-filters-and-categories - specifically:

    "When building or rebuilding your categories you need to think in a hierarchical way, as if you are building a pyramid; starting from your homepage, and mapping out to your various products. Often, I use breadcrumb navigation links to see how my structuring will be represented (a large whiteboard is also a useful tool). This allows me to ensure that the structure makes sense going from a broad or main category to more specific ones. Narrowing down while adding value can be hard, but doing so will provide a better user browsing experience. If you're seeing a lot of duplicate categories you might not need as many total categories."

    Rand has also done a great WBF on this topic (information architecture), so feel free to learn from that as well: https://moz.com/blog/information-architecture-for-seo-whiteboard-friday - regarding breadcrumbs he says:

    "By the way, for SEO purposes it does help if I link back and forth one level in each case. So for my hedgehog page, I do want to link down to my hedgehogs in military uniforms page, and I also want to link up to my adorable animals page. You don't have to do it with exactly these keyword anchor text phrases, that kind of stuff. Just make sure that you are linking. If you want, you can use breadcrumbs. Breadcrumbs are a very kind of old-fashioned, been-around-since-the-late-'90s sort of style system for showing off links, and that can work really well for some websites. It doesn't have to be the only way things can work, though."

    Hope this helps give you some clarity on internal linking! Cheers!
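
    If you also want the breadcrumb trail to be eligible for breadcrumb display in search results, it can be marked up with schema.org BreadcrumbList structured data. A minimal sketch, reusing Rand's example categories with hypothetical URLs:

    ```html
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "name": "Adorable Animals",
          "item": "http://www.example.com/adorable-animals/" },
        { "@type": "ListItem", "position": 2, "name": "Hedgehogs",
          "item": "http://www.example.com/adorable-animals/hedgehogs/" }
      ]
    }
    </script>
    ```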

    | RyanPurkey
    0

  • In that case I would say a phased approach would be fine. It is an issue that should be addressed, but I wouldn't classify it as "critical".

    | Everett
    0

  • Check out this awesome post by AJ Kohn To answer your question... "What I’ve observed over the last few years is that pages that haven’t been crawled recently are given less authority in the index. To be more blunt, if a page hasn’t been crawled recently, it won’t rank well." "The pages that aren’t crawled as often are pages with little to no PageRank. CrawlRank is the difference in this very large pool of pages. You win if you get your low PageRank pages crawled more frequently than the competition."

    | OlegKorneitchouk
    0