Questions
My keywords aren't performing for my Umbraco site
The first thing to check is whether Google has actually crawled the new pages and the recent changes to existing pages. If it hasn't, that's why you aren't ranking in the top 100 yet. To check whether a page has been indexed, paste its URL into Google.com: if no result appears, the page has not been indexed; if Google shows it as a SERP listing, it has. To see when Google last cached the page, click the green drop-down arrow next to the URL in the result and choose "Cached"; the cached copy shows the date of Google's last crawl of that page.
Removing a site from the Google index with noindex meta tags
This is the best response. Others have suggested robots.txt; that's a bad idea IMO. Robots.txt stops Google from crawling pages, while a meta noindex tag directs Google not to index a page. If Google can't crawl a page (because of robots.txt), it will never see the noindex directive. As Jordan says, noindex should come first. Once all pages have been de-indexed, OP can then think about robots.txt as Rajesh suggested. OP could also combine meta noindex with a 410 (Gone) status code to make the signal stronger, though that is inadvisable in OP's situation (where the site will stay live for users but be gone from Google). In the end, Jordan's reply is the best one left here. A final note: instead of editing the HTML of all of OP's pages, OP could send noindex through the X-Robots-Tag HTTP header, which is often more scalable.
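The header-based approach can be sketched like this. This is a minimal illustration, not code from the thread; the function name and parameter are hypothetical, and in practice you'd set the header in your web server or framework config rather than in a helper like this:

```python
def deindex_headers(keep_page_live: bool = True):
    """Return (status_code, headers) that tell crawlers not to index a page.

    X-Robots-Tag: noindex in the HTTP response works like
    <meta name="robots" content="noindex"> in the HTML, but can be applied
    server-wide, which scales better than editing every page.
    """
    headers = {"X-Robots-Tag": "noindex"}
    # 410 Gone is a stronger removal signal, but per the thread it's only
    # appropriate if the page should disappear for human visitors too.
    status = 200 if keep_page_live else 410
    return status, headers

status, headers = deindex_headers()
print(status, headers)  # 200 {'X-Robots-Tag': 'noindex'}
```

Note that the page must remain crawlable (not blocked in robots.txt) for Google to ever fetch the response and see this header.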