Questions
Export all keywords from Keyword Difficulty and SERP
Hi there, thanks for writing to us! You can export all of your existing keywords from Keyword Difficulty with the Export CSV option shown here: http://screencast.com/t/DSxnAnpJuGm. Unfortunately, the export does not include the SERPs in the CSV; downloading all of your keywords along with their SERPs is not possible at the moment because it would put too much pressure on the system. We may be able to expand this capability down the line, but for now it is beyond what the tool can do. Here is an article that dives a bit deeper into the tool, so if you have any questions for me, let me know! https://moz.com/help/guides/research-tools/keyword-difficulty
Moz Local | Sean_Peerenboom
Old pages - should I remove them from SERPs?
Since the campaigns are no longer available, I assume that these pages are no longer useful to the visitor. In which case, just put a note on the landing page that says "This offer is no longer available, but you might be interested in ...." In this case, the URLs remain in Google's index. Alternatively, you can remove the pages from the server and 301-redirect each URL to your current campaign(s) if that seems more appropriate for the user. Because of the redirects, Google will eventually remove the URLs from their index. In terms of SEO, the goal is not to lose any link equity coming from backlinks to those landing pages. If you simply use the URL removal tool, you'll lose any value from the backlinks. As Chris says, the canonical tag is relevant to duplicate content and not this type of situation.
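The 301-redirect option described above can be sketched as a simple lookup table. This is a minimal illustration, not a server implementation; the campaign paths and the current-campaign target are hypothetical placeholders.

```python
# Hypothetical map of retired campaign landing pages to the current
# campaign; each old URL 301-redirects so its link equity is passed on.
REDIRECTS = {
    "/campaigns/spring-offer": "/campaigns/current-offer",
    "/campaigns/summer-offer": "/campaigns/current-offer",
}

def handle_request(path):
    """Return (status_code, location) for a request path: a 301 to the
    current campaign if the landing page was retired, otherwise 200."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path
```

Because each old URL answers with a permanent redirect rather than a 404 or a removal request, Google eventually swaps the URLs in its index while backlinks continue to count toward the destination page.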
Technical SEO Issues | LauraSultan
Why did blocking a subfolder drop indexed pages by 10%?
The thing is that I check indexed pages on a regular basis, and usually the fluctuations are small, only a few pages. But never so many pages. Organic traffic did drop, but only slightly, and rankings were never affected. As you said, I will keep an eye on this.
Technical SEO Issues | catalinmoraru
Blocked URL parameters can still be crawled and indexed by Google?
If you want to permanently remove URLs from the index, this is the basic process:

1. Have your developer implement NoIndex, Follow on all pages that have the URL parameter you want removed. For example, if the URL contains categoryFilter= (like above), add the NoIndex, Follow tag to the head of the page. Do this for all URL parameters you want removed from the index.
2. Make sure Google is allowed to crawl those pages. If they are blocked by robots.txt, or Google is told not to crawl them via Google Webmaster Tools, Google will not be able to see the newly implemented NoIndex, Follow tag.
3. Give it some time and wait. It may take Google a long time to recrawl all of these parameterized URLs, so falling out of the index may be slow.
4. Once the URLs are gone, consider blocking them from crawling via robots.txt or GWT parameter handling.
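The first step above, deciding which pages should emit the NoIndex, Follow tag based on their URL parameters, can be sketched like this. The parameter set is hypothetical, using the categoryFilter example from the question; a real template or middleware would emit the returned tag into each page's head.

```python
from urllib.parse import parse_qs, urlparse

# Hypothetical set of URL parameters whose pages should be de-indexed;
# "categoryFilter" is the example parameter discussed above.
NOINDEX_PARAMS = {"categoryFilter"}

def robots_meta(url):
    """Return the robots meta tag to emit in the page head: noindex, follow
    for parameterized URLs we want dropped from the index, otherwise the
    normal index, follow tag."""
    query = parse_qs(urlparse(url).query)
    if NOINDEX_PARAMS & query.keys():
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'
```

Note that "follow" is kept in both cases so Googlebot continues to pass link equity through these pages even while they drop out of the index.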
Technical SEO Issues | anthonydnelson
Problem crawling a website with age verification page.
Hello Catalin, our crawler will not be able to get past an age verification page. You will need to find or unlock a subfolder or subdomain that bypasses it if you would like our crawlers to get through. Luckily, Google's crawlers are a bit more thorough and will be able to index your site properly. We are hoping to add this ability soon, and I hope you can find a way for us to get through in the meantime.
Moz Tools | Abe_Schmidt