Someone else in the forum has posted a very similar question to you! My response there:
"It looks like you're not the only one: https://www.seroundtable.com/google-dropping-pages-out-of-the-search-index-27369.html
No response from Google yet."
Hi,
For your more generic keywords (perhaps general industry or overall business terms), I would use those throughout the site. For more focussed keywords, such as individual products, services, or content pieces, I would keep those to individual pages.
A very brief example: if you were a car servicing centre, I would use terms like "car maintenance" and "car repair" across the site, but focus specific pages on "Tyres", "Brakes", "MOTs", "Car Air Conditioning", etc.
As I understand it, the way Moz crawls means it won't pick up HTML that is added dynamically. Without knowing which site you're talking about, I can't check, but I worked with a client whose site dynamically generated areas of the page after the first paint, and that caused the Moz crawler some trouble in correctly identifying issues on the site.
I would suggest checking with Search Console's "Fetch as Google" tool just to make sure Google is "seeing" the page correctly and, if so, ignoring the alerts in Moz.
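If you want a quick sanity check of what a non-rendering crawler sees, you can also fetch the raw HTML and look for the content that should have been added dynamically. A minimal sketch in Python; the URL and marker string are hypothetical stand-ins for your page and whatever your JavaScript inserts:

```python
# Check whether content exists in the raw HTML (what a non-rendering
# crawler sees) or only appears after JavaScript runs.
# The URL and marker string below are hypothetical.
import requests

URL = "https://www.example.com/page"   # the page Moz is flagging
MARKER = 'id="dynamic-section"'        # something your JS adds to the page

raw_html = requests.get(URL, timeout=10).text
print("Present in raw HTML:", MARKER in raw_html)
# False suggests the content is injected client-side, which would explain
# why the Moz crawler isn't seeing it.
```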
What do the link profiles for each of the pages look like? And what is the traffic source breakdown for the two pages?
Do you mean pages you initiate the search from, or the search results page? (I know these can be the same thing in some cases)
I would allow a page that you initiate searches from to be indexed, depending on what it's used to search. Someone who has put a query into Google might find your search page useful in resolving their query.
I wouldn't allow search results pages to be indexed, for obvious reasons, even for specific searches linked from other locations. You'd be better off creating category index pages or similar.
Firstly, I would definitely take the opportunity to switch to SSL. A migration to SSL shouldn't be something to worry about if you set up your redirects properly, but given that most of your pages aren't indexed at all, it is even less risky.
You will eventually get the traffic back; as for how long that will take, it's very difficult to say.
I would concentrate on crawlability: make sure your structure makes sense and that you aren't linking to any 404s or worse. Given the size of your site, that wouldn't be a bad thing anyway.
From your description of your pages, I'm not sure there is any "importance hierarchy", so my suggestion may not help, but you could make use of Google's Indexing API to submit pages for crawling. Unfortunately, you can only submit in batches of 100 and you are limited to 200 URLs a day. You could, of course, prioritise or cherry-pick some important pages and "hub" pages, if such things exist within your site, and then start working through those.
Following the recent Google blunder where they deindexed huge swathes of the web and, in the short term, the only way to get pages back in the index was to resubmit them, someone has provided a tool to interact with the API, which you can find here: https://github.com/steve-journey-further/google-indexing-api-bulk
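If you'd rather script it yourself than use that tool, the core call is quite small. Here's a minimal sketch against Google's Indexing API (my own illustration, not the linked tool's code); the service account key file and URLs are hypothetical, and the daily quota mentioned above still applies:

```python
# A minimal sketch for notifying Google's Indexing API about updated URLs.
# Assumes a Google Cloud service account JSON key authorised for the
# Indexing API; the key filename and URLs are hypothetical.
# Requires the google-auth package.
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

SCOPES = ["https://www.googleapis.com/auth/indexing"]
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)  # hypothetical key file
session = AuthorizedSession(credentials)

# Hypothetical "hub" pages to prioritise, as suggested above
urls = [
    "https://www.example.com/",
    "https://www.example.com/important-hub-page/",
]

for url in urls:
    # URL_UPDATED tells Google the page is new or has been updated
    response = session.post(ENDPOINT, json={"url": url, "type": "URL_UPDATED"})
    print(response.status_code, url)
```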
Hi,
It depends what you want to report. One of the most common violations (in my opinion) is link buying, and you can report that by following the links here: https://support.google.com/webmasters/answer/93713?hl=en
Many other types of violation are automatically picked up (and I suspect generally link buying is picked up eventually). What specifically do you need/want to report?
Alex
Sorry, by link profile I meant how many links, and of what quality, you have pointing at each page.
I might not have understood your question, so apologies in advance if that's the case.
Your redirects won't be temporary; they'll be permanent (301). As far as the search engines (and anyone else) are concerned, the location has moved permanently.
You can't really set a redirect (temporary or permanent) as nofollow. The redirect is a response code from the server; it's not a link. To be fair, you wouldn't want to set it to nofollow even if you could: you want the search engines to follow the redirection to the new location and index that.
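If you want to confirm exactly what your server returns for an old URL, a couple of lines will show you the status code and the target. A quick sketch, with a hypothetical URL:

```python
# Check that the old URL returns a permanent (301) redirect rather than a
# temporary (302) one. The URL is hypothetical.
import requests

resp = requests.get("https://www.example.com/old-page",
                    allow_redirects=False, timeout=10)
print(resp.status_code)              # expect 301 for a permanent move
print(resp.headers.get("Location"))  # where the redirect points
```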
An interesting question...
Directly, no; Google states that PPC spend won't affect rankings. However, it's an interesting concept: if Google knows that a term you are using as a PPC keyword is driving traffic with a low bounce rate, that would indicate the page is relevant, so should it not rank more highly organically? Who knows. Even if it does have an impact, I wouldn't expect it to be high up the list, so I wouldn't include those pages/terms purely for that reason.
That said, Google Ads can give you some useful insight into Google's opinion of your pages for specific keywords. If your Ad Rank is low for a page/keyword combination, it is unlikely to rank well organically. You can look at "Landing Page Experience" and, to a lesser extent, "Expected CTR" to get an idea of which areas to target; improving those metrics in Ads could also improve your chances organically, given the actions you would likely have had to take to do so.
I've never come across any issues with using a variety of TLDs from a purely SEO point of view. I would argue that they can have some effect on CTR (which could indirectly affect your rankings) if you have competitors that use "more important" TLDs with the same name, e.g. yourdomain.co.uk vs yourdomain.com, but in your case, with a company name, this seems less likely to be an issue. While I would recommend getting hold of the other domains (if they haven't already) and redirecting them, I wouldn't suggest migrating to a different TLD if they have used the original domain for a long period.
I guess it depends on how long term you're thinking.
For a shorter-term (and less risky) project I would definitely stick with URL 2: carefully improve the content using elements from URL 1 and then redirect from 1 to 2.
If you're in it for the long term and are open to a (hopefully) short-term reduction in rankings (and therefore traffic), you could do it the other way round. The redirect would pass any link juice across, albeit slightly diluted, and it would not be unreasonable to expect URL 1 to start ranking in a very similar way to URL 2 over time.
If this were a new piece I would certainly guide you towards the longer, more descriptive URL but, given the performance of the existing page, that is the riskier strategy. Your current results clearly show that the URL isn't a massive factor!
I don't think there is a definitive answer here, but I hope I have helped a little.
Google will often make up its own mind what to put in your description, especially if it thinks that your provided description doesn't match what it thinks the page is about.
I'm afraid your developer hasn't fixed the issue. I was looking in the wrong place (your screenshot identified it): the malware isn't replacing the meta description, it is actually inserting text at the top of the page, but only when it detects that the mobile Googlebot is visiting.
Using Google Chrome Developer Tools you can set your user agent and see the issue yourself (see my screenshot).
If I were you, I'd disable all the plugins, then reload the page in Chrome with the user agent set and see if that helps; if not, I would look at your theme's JS/source files.
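If you'd rather test from outside the browser, you can compare what the server sends to a normal client against what it sends to the mobile Googlebot user agent. A sketch, assuming the injection happens server-side; the URL is hypothetical and the user agent is one of Google's published smartphone crawler strings:

```python
# Compare the page as served to a normal client versus the mobile Googlebot
# user agent; if the malware triggers server-side, the responses will differ.
# The URL is hypothetical.
import requests

URL = "https://www.example.com/"
GOOGLEBOT_MOBILE_UA = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 "
    "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
    "+http://www.google.com/bot.html)"
)

normal = requests.get(URL, timeout=10).text
as_bot = requests.get(URL, headers={"User-Agent": GOOGLEBOT_MOBILE_UA},
                      timeout=10).text

print("Responses differ:", normal != as_bot)  # True suggests UA cloaking
```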
Have you considered the ongoing video series by Rand on Moz's Whiteboard Friday? It's up to date and a great format.
The first one is here: https://moz.com/blog/one-hour-seo-guide-part-1-seo-strategy
There are links to the others in the series so far at the bottom (apart from part 5, which hasn't been linked yet: https://moz.com/blog/one-hour-guide-to-seo-technical-seo).
I may have misunderstood, but I don't think 22Eighteen was asking about having multiple sites with different TLDs targeting multiple areas; I got the impression that this was a single site that happened to have a provincial TLD.
Hi Davit,
This is a widely reported problem. Fortunately it is now being reported as fixed. https://searchengineland.com/google-de-indexing-issue-now-fixed-result-of-technical-issues-315058
Hope your site soon reappears!
With that many pages, I assume you are already carefully managing structure and crawlability. I wouldn't think that a few thousand pages on a site of a million would make much difference (your new pages only make up about 0.1% of your total site); if they're linked properly and included in the sitemap(s) correctly, you shouldn't have much of an issue. I certainly wouldn't suggest drip-feeding them in.
1. I don't think it will; Google has got very good at ignoring these spammy sites. Creating large disavow lists isn't technically that hard, but I don't think I would spend the time doing it, seeing as you haven't seen any impact.
2. I don't think either of the response codes you're returning is appropriate.
A 403 indicates that the client doesn't have permission, from which it could be inferred that the file does actually exist and therefore the link is valid, which is definitely not something you would want Google to think.
While you have disavowed the links you are 302'ing, I still don't think 302 is the right response. For a start, 302 has been superseded anyway, but more importantly a 302 indicates "moved temporarily". That is certainly not the case: the page doesn't exist and never has. The only reason to 302 would be if you were expecting traffic from these links, but I think that also sends a bad message to Google.
I would definitely suggest 404 for both cases.
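Once the change is in, it's worth auditing a sample of those URLs to confirm they now return 404. A quick sketch; the URLs are hypothetical placeholders for the spam targets:

```python
# Audit what each spam-target URL actually returns; after the fix they
# should all come back 404 rather than 403 or 302. URLs are hypothetical.
import requests

spam_urls = [
    "https://www.example.com/fake-page-1",
    "https://www.example.com/fake-page-2",
]

for url in spam_urls:
    resp = requests.head(url, allow_redirects=False, timeout=10)
    print(resp.status_code, url)  # expect 404
```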
What would a user who wasn't logged in see if they visited it?
Depending upon your company, I would suggest exposing the knowledge base; it could rank for people looking to solve problems that your products would help with. The same goes for the forum. If you chose to do that, you might be better off simply moving that content somewhere else in your site structure.
For everything else, Google wouldn't typically be able to log in, and what would you actually show it? It's not a customer, so it wouldn't have any tickets or products to list serial numbers for. You could detect that it's Googlebot and show something else, but that's very bad practice!
Hi Jim,
The key thing, as I'm sure you're aware, will be to manage all of your redirects! It looks to me like there will be quite a lot of duplicated/similar content across the three existing domains that you'll want to combine, and you'll then need to ensure that old pages redirect to the new content; this makes it quite complicated and will require lots of custom redirects rather than a simpler bulk https://blog.lucybee.com/* -> https://lucybee.com/blog/*
Depending upon the new structure for the shop element of your site, you might be able to use wildcard redirects (e.g. https://shop.lucybee.com/collections/* -> https://lucybee.com/products/*), but you'd obviously need to be very careful to make sure they all match up.
I'm not familiar with Shopify, but I would suggest getting in touch with them and explaining what you plan to do. If I were carrying out your project I would want to set up a dev site to make all the content and structure changes, and then prepare my redirects, before then putting the whole thing live.
A crawl of the existing three sites before you start, and then a map of your redirects, is probably the best approach. You can then bulk check the old URLs once you go live to make sure you haven't missed anything (https://httpstatus.io/), or script it yourself as sketched below.
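Here's a minimal sketch of that final check, walking a redirect map and flagging mismatches; the CSV filename and its old_url,expected_url layout are my assumptions about how you'd record the mapping:

```python
# Walk a redirect map and confirm each old URL ends up at the expected new
# URL. The CSV format (old_url,expected_url per row) is an assumption.
import csv
import requests

with open("redirect-map.csv") as f:
    for old_url, expected_url in csv.reader(f):
        resp = requests.head(old_url, allow_redirects=True, timeout=10)
        status = "OK" if resp.url == expected_url else "MISMATCH"
        print(status, old_url, "->", resp.url)
```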