Having the same problem - it just says "Getting serp analysis failed. Please retry your search or refresh this page" on all the browsers I try it on. I've hard-reloaded too and it's still not working.
Posts made by Frankie-BTDublin
-
RE: Keyword Research (Difficulty) tool not working. Please help!
-
RE: Forced to remove Categories with high volume & revenue
Thanks Will! Yep, sounds similar to what I've sent on to Development, where the filters are actually those sub-category pages. Unfortunately they think it's going to be a huge amount of work, so now I need to show the value of creating these pages before they start working on it. From the macro point of view, unfortunately, I had no choice but to redirect, and those 301s are all in place now. It's painful to do when you know it's going to damage performance, and after a couple of weeks the stats look like they're showing it already has
But it's great to have your feedback - it'll definitely give weight to my pitch to get those filters working for us! The top-level idea might actually be a great workaround for now too!
-
Forced to remove Categories with high volume & revenue
Hi everyone
I've been forced to remove level 4 & 5 categories (e.g. example.com/level-2/level-3/level-4/level-5/) from our website, even though they're getting plenty of traffic and revenue and are ranking for some of our keywords. The argument is that customers were using refinements/filters more than clicking into categories, and a new backend system is coming into the business, so these need to be removed anyway.
We've done this before and seen a drop in visibility, revenue & traffic in these areas, but we're going ahead with another batch of removals anyway. I was wondering if anyone has experience fixing a problem like this? I've been told the categories will not be returning and I have to 301 them, so I need to find a workaround to become eligible to rank for these keywords again.
I've been looking at using the refinements to make them look like a category (changing the URL to a clean one, updating the Page Title, Meta Description and H1, and removing text from the core page when a refinement is clicked), but I'm not sure what kind of knock-on effects this will have, or if it will even work!
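To make it concrete, the kind of head changes I mean on a "clean" refinement page would look something like this (the URLs and copy here are made-up placeholders, not our real pages):

```html
<!-- Hypothetical clean facet URL, e.g. /women/shoes/brown/ instead of ?prefn1=colour&prefv1=brown -->
<head>
  <title>Brown Women's Shoes | Example Store</title>
  <meta name="description" content="Shop our range of brown women's shoes.">
  <!-- Self-referencing canonical so the clean URL is the one eligible to rank -->
  <link rel="canonical" href="https://www.example.com/women/shoes/brown/">
</head>
<!-- Updated H1 in the body, replacing the parent category's text -->
<h1>Brown Women's Shoes</h1>
```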
Hope you can help! I've probably missed some details so let me know if you need more info!!!
Thanks
-
RE: Use Internal Search pages as Landing Pages?
Hi Guillaume_L
Your idea looks good to me! As long as you keep the number of URL parameters to one or two, as outlined in the article you shared, or you'll end up with millions/billions of URLs. It'll also help keep your Page Titles/Descriptions shorter if you only have one or two parameters set up. I'd suggest noindexing any pages that have more than two parameters selected.
Seems like you'll be doing something similar to what Airbnb are doing, so I think you should be good to go!
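Just to illustrate the noindex suggestion - on any page with three or more parameters selected, I mean adding something like this (the URL in the comment is a made-up example):

```html
<!-- e.g. /search?brand=x&colour=y&size=z - three parameters, so keep it out of the index
     but let Googlebot follow the links on it -->
<meta name="robots" content="noindex, follow">
```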
-
Use Internal Search pages as Landing Pages?
Hi all
Just a general discussion question about internal search pages and using them for SEO. I've been looking at setting them to "noindex, follow", but a lot of the search pages are actually driving significant traffic & revenue.
I have over 9,000 search pages indexed that I was going to remove, but after reading this article (https://www.oncrawl.com/technical-seo/seo-internal-search-results/) I was wondering if any of you have had success using these pages for SEO, for example with auto-generated content. Or any success stories about using "noindex, follow" too.
Thanks!
-
RE: Removing indexed internal search pages from Google when it's driving lots of traffic?
Hi effectdigital
Yep I won't be doing that! I don't think I'll be using the robots.txt file at all for this.
Thanks
Frankie
-
RE: Removing indexed internal search pages from Google when it's driving lots of traffic?
Hi Dave
I see Amazeinvent has "noindex, follow" on your search pages. Did you add it afterwards?
Thanks
-
RE: Removing indexed internal search pages from Google when it's driving lots of traffic?
Hi Rajesh
1 - Add a robots tag on these types of pages
Any reason why you'd add "nofollow" on search pages? I'd have thought "follow" is a better option?
2 - Update robots.txt file - Disallow: /search OR /search/*
I've done this before and traffic dropped a lot over the next 2-4 weeks. We regained the traffic when we removed the disallow.
3 - Add new parameter in GSC as "No URLs"
Done already, thanks
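For anyone else reading, the robots.txt change in point 2 would look something like this sketch - though as I say above, be careful, because this hit our traffic when we tried it:

```text
# Hypothetical robots.txt - blocks all internal search URLs from being crawled
User-agent: *
Disallow: /search
```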
-
Removing indexed internal search pages from Google when it's driving lots of traffic?
Hi
I'm working on an e-commerce site and the internal search results page is our 3rd most popular landing page. I've also seen that Google has often used this page as a "Google-selected canonical" in Search Console on a few pages, and it has thousands of these search pages indexed.
Hoping you can help with the below:
To remove these results, is it as simple as adding "noindex, follow" to the search pages?
Should I do it incrementally? There are parameters (brand, colour, size, etc.) in the indexed results and maybe I should block each one of them over time.
Will there be an initial negative impact on results I should warn others about?
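For clarity, by "noindex, follow" in question 1 I mean either the meta tag in the search template, or a header set at server level - something like this sketch (nginx is just an assumption here, your stack may differ):

```nginx
# Hypothetical nginx config: send a noindex,follow header on all internal search URLs
location /search/ {
    add_header X-Robots-Tag "noindex, follow";
}
```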
Thanks!
-
RE: Which product URL to include in Sitemaps?
Great, thanks! I was really hoping I'd suddenly be selling loads of wallets

-
RE: Which product URL to include in Sitemaps?
But in the sitemap? Canonical tags aside, will "/women/accessories/wallets/" gain increased authority if I include "/women/accessories/wallets/jet-set-double-zip-wallet/" rather than "/brands/michael-kors/bags/jet-set-double-zip-wallet/" in the sitemap?
-
Which product URL to include in Sitemaps?
Hi
Do the product URLs in sitemaps affect the sub-categories' authority too?
For example, if I have a product with 2 URLs which share a canonical tag:
- /brands/michael-kors/bags/jet-set-double-zip-wallet/
- /women/accessories/wallets/jet-set-double-zip-wallet/
If I make "/women/accessories/wallets/jet-set-double-zip-wallet/" the main URL, set it as the canonical URL and list it in the XML sitemap, will that also mean the "/women/accessories/wallets/" category gets more authority and increases its power to rank?
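To make the setup concrete, this is what I mean (the example.com domain is just a placeholder for ours):

```xml
<!-- On the duplicate /brands/michael-kors/bags/jet-set-double-zip-wallet/ page:
     <link rel="canonical"
           href="https://www.example.com/women/accessories/wallets/jet-set-double-zip-wallet/"/>
     And in the XML sitemap, list only that canonical URL: -->
<url>
  <loc>https://www.example.com/women/accessories/wallets/jet-set-double-zip-wallet/</loc>
</url>
```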
Thanks
Frankie
-
RE: Disallowed "Search" results with robots.txt and Sessions dropped
Hi Gaston
"Search/" pages were getting a small amount of traffic and a tiny bit of revenue, but I definitely don't think they need to be indexed or are important to users. We're down mainly in "Sale" & "Brand" pages, and I've heard the sale in general across the store isn't going well, but I don't think I can go back to management with that excuse

I think my sitemaps are sorted now - I've broken them down into 6 x 5,000-URL files, and all the canonical tags seem to be fine and pointing to these URLs. I am a bit concerned that URLs "blocked by robots.txt" shot up from 12M to 73M, although all the URLs Search Console is showing me look like they need to be blocked!
We're also tracking nearly 300 keywords, and they've actually had good improvements in the same period. Finding it hard to explain!
-
RE: On page Grader is not working on specific site
I had the same problem. Try removing the "https://" - it seemed to work fine for me after that.
-
Disallowed "Search" results with robots.txt and Sessions dropped
Hi
I've started working on our website and I've found millions of "Search" URLs which I don't think should be getting crawled & indexed (e.g. .../search/?q=brown&prefn1=brand&prefv1=C.P. COMPANY|AERIN|NIKE|Vintage Playing Cards|BIALETTI|EMMA PAKE|QUILTS OF DENMARK|JOHN ATKINSON|STANCE|ISABEL MARANT ÉTOILE|AMIRI|CLOON KEEN|SAMSONITE|MCQ|DANSE LENTE|GAYNOR|EZCARAY|ARGOSY|BIANCA|CRAFTHOUSE|ETON). I tried to disallow them in the robots.txt file, but our sessions dropped about 10% and our average position in Search Console dropped 4-5 positions over 1 week. It looks like over 50 million URLs were blocked; all of them are like the example above and aren't getting any traffic to the site.
I've allowed them again, and we're starting to recover. We've been fixing problems with getting the site crawled properly (sitemaps weren't added correctly, products were blocked from spiders on category pages, canonical pages were blocked from crawlers in robots.txt), and I'm thinking Google was doing us a favour and using these search pages to crawl the product pages, as they were the best/only way of accessing them.
Should I be blocking these "Search" URLs, or is there a better way of going about it? I can't see any value in these pages except Google using them to crawl the site.
-
Looking for a list of pros & cons of removing a folder from the URL?
Hi
We have a sub-folder ("/shop-by-department/") which is pretty much useless on our site, and I'm looking to remove it. But the team want a list of the pros & cons of doing so.
So, for example, I'll be changing www.example.ie/shop-by-department/furniture/beds/product-a to www.example.ie/furniture/beds/product-a
I know there will be an initial hit as Google adjusts to the change, but I think it's definitely the way to go. I was looking for a complete list of the pros & cons to send on to the team. It'll be going to the traditional marketing side (print, radio, etc.) too, so it can be top-level points.
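For the developers, the redirect side of this would be something along these lines - a sketch assuming we're on Apache, so adjust if the stack is different:

```apache
# Hypothetical .htaccess rule: 301 everything under /shop-by-department/
# to the same path without that folder
RewriteEngine On
RewriteRule ^shop-by-department/(.*)$ /$1 [R=301,L]
```

So www.example.ie/shop-by-department/furniture/beds/product-a would 301 to www.example.ie/furniture/beds/product-a in one rule, rather than mapping every URL by hand.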
Hope you can help!
Thanks
-
RE: Expanding into a new country & what to do with Search Console
Thanks Rajesh! Very clear, will pass that on to developers!
-
Expanding into a new country & what to do with Search Console
Hi!
We're looking at expanding into new countries, and will probably go with the subfolder route. Our main website is targeted at Ireland in Search Console (and probably always will be), so will this be affected if I add subfolders onto the end? And can I stop the main site's targeting from being applied to the new URLs in the subfolder?
So if www.example.com is focused on Ireland and we add www.example.com/de for Germany, can we let Google know not to serve the German pages in Ireland? And will I need to do anything to the Irish version (e.g. change www.example.com to www.example.com/ie)?
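From what I've read, the standard way to signal which version goes with which country is hreflang annotations on each version - something like this sketch (the en-ie / de-de language-country codes are my assumption about what we'd target):

```html
<!-- The same set of annotations goes in the <head> of both the Irish
     and German versions of each page -->
<link rel="alternate" hreflang="en-ie" href="https://www.example.com/" />
<link rel="alternate" hreflang="de-de" href="https://www.example.com/de/" />
<!-- Fallback for users who match neither targeted country -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```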
-
RE: Reason for robots.txt file blocking products on category pages?
Thanks again AL123al!
I would be concerned about my internal linking because of this problem. I've always wanted to keep important pages within 3 clicks of the Homepage. My worry here is that while a user can reach these products within 3 clicks of the Homepage, they're blocked to Googlebot.
So the product URLs are only getting crawled via the sitemap, which would be hugely inefficient? I think I have to decide whether opening up these pages to improve my linking structure for Google is more important than the crawl budget wasted by increasing the number of pages it's able to crawl.
-
RE: Reason for robots.txt file blocking products on category pages?
Thanks AL123al! The base URLs (www.example.com/product-category/ladies-shoes) do seem to be getting crawled here & there, and some are ranking, which is great. But I think the only place they can get crawled is the sitemap, which has over 28,000 URLs in one file (another thing I need to fix)!
So if Googlebot gets to the parameter URL through category pages (www.example.com/product-category/ladies-shoes?cgid...) and sees it's blocked, I'm guessing it can't see that the page is important to us (from the website hierarchy) or see the canonical tag, so I'm presuming it's seriously damaging our power to get products ranked

In Screaming Frog, 112,000 URLs get crawled and 68% are blocked by robots.txt. 17,000 are URLs which contain "?cgid", which I don't think is too many for Googlebot to crawl, and the website has pretty good authority so I think we get a pretty deep crawl.
So I suppose what I really want to know is: will removing "?cgid" from the robots.txt file really damage the site? In my opinion, I think it'll really help.