Questions
Website blog is hacked. What's the best practice to remove bad URLs?
Technically, a 404 only means "not found", and Google may treat it as temporary and keep re-checking the URL in case it comes back, so you might want to consider a 410 (Gone) instead of a 404 to signal permanent removal. You could also supplement it with a meta noindex; if you can't use the HTML implementation, then fire the noindex directive through the X-Robots-Tag HTTP header: https://developers.google.com/search/reference/robots_meta_tag (scroll down a little to find the relevant part). E.g.:

    HTTP/1.1 200 OK
    Date: Tue, 25 May 2010 21:42:43 GMT
    (…)
    X-Robots-Tag: noindex
    (…)

... something like that.

Note that Search Console can't permanently remove URLs from Google: the Remove URLs tool only handles one URL at a time, and it only does so temporarily; the URLs pop back again after a bit. The best thing you can do is give Google some harsher directives and hope it listens; in a month or two most of those should be gone.

Don't block the URLs in robots.txt: if Google can't crawl them, it won't find the 410s or the noindex directives.
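If it helps to verify the clean-up, here is a minimal Python sketch (assuming the requests library and a hypothetical hacked_urls.txt file listing the bad URLs) that checks each URL now returns a 410 and carries the X-Robots-Tag: noindex header:

```python
# Minimal sketch: verify each formerly hacked URL now returns 410 Gone
# and carries X-Robots-Tag: noindex. "hacked_urls.txt" (one URL per
# line) is a hypothetical input file.
import requests

def audit_removed_urls(path="hacked_urls.txt"):
    with open(path) as f:
        urls = [line.strip() for line in f if line.strip()]
    for url in urls:
        resp = requests.get(url, allow_redirects=False, timeout=10)
        robots_tag = resp.headers.get("X-Robots-Tag", "")
        ok = resp.status_code == 410 and "noindex" in robots_tag.lower()
        print(f"{url}: status={resp.status_code}, "
              f"x-robots-tag={robots_tag or 'missing'} "
              f"-> {'OK' if ok else 'NEEDS ATTENTION'}")

if __name__ == "__main__":
    audit_removed_urls()
```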
Technical SEO Issues | effectdigital
Top pages according to Link Explorer show 301 pages and images
Usually this is accurate, but it could mean you need to take further measures to insulate your SEO authority.

A 301 redirect won't transfer 100% of the link equity from one URL to another. If the pages are highly related and share much of the same content, almost all of the link equity flows through; if the pages contain significantly different content or are not related thematically, as little as 0% of the equity can flow through the 301 redirect (it's not a simple input/output equation).

The SEO authority of a given URL is still partially (maybe mostly) defined by Google's PageRank equation. Whilst 'toolbar' PageRank is dead, 'real' PageRank (which SEOs have never seen) is still an integral ranking factor. Google still, for the most part, considers the web to be an amalgam of interlinked pages rather than websites or domains. That's not to say that domain-level checks don't happen; they do. But since Google lists individual web pages in its results, not entire sites, page-level metrics remain extremely important.

If you combine both of these pieces of knowledge, you'll see why Moz's Link Explorer may state that some of your URLs which now return 301s are worth more attention in terms of your SEO. Other tools like Ahrefs or Majestic do exactly the same thing; it's not accidental. A page with loads of great backlinks will usually outperform another URL receiving similar-calibre link equity that is then diluted (a little or a lot) through redirects, even the mighty 301.

Because of all this, whilst the 301 redirect is a great measure to pass as much equity to the new URL as possible, it's not as good as having all of those links altered to point to your new page. Link amends almost always outperform 301s when managed in their totality; the viability of getting every link switched over, though, is minimal, since the code of those linking sites is not under your direct control.

The suggestion is always to put the 301 layer underneath, but to get as many of your links as possible actually shifted to point to your new address. Certainly your very best backlinks should be moved over. That way, even if the new URL is a little different and Google's page-comparison algorithm kicks in, you've partially circumvented the issue.

Due to all these factors, migrations of any kind (even internal ones) often result in slight traffic dips and dents. That said, moving to a better architecture unlocks the long-term space to achieve more than you ever did before; without growing room, you stagnate (and in the competitive world of internet marketing, that's a big no-no).
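To prioritise which backlinks to get edited, a rough Python sketch along these lines might help (assuming the requests library and a hypothetical backlinks.csv of source_page,target_url rows exported from your link tool); it follows each target's redirect chain so you can see which links still pass through one or more hops:

```python
# Minimal sketch: follow the redirect chain behind each backlink target
# so the best links can be pointed straight at the final URL.
# "backlinks.csv" (rows of source_page,target_url) is a hypothetical
# export from a link research tool.
import csv
from urllib.parse import urljoin

import requests

def check_redirect_chains(path="backlinks.csv", max_hops=10):
    with open(path, newline="") as f:
        for source, target in csv.reader(f):
            chain = [target]
            url = target
            for _ in range(max_hops):
                resp = requests.head(url, allow_redirects=False, timeout=10)
                if resp.status_code not in (301, 302, 307, 308):
                    break
                # Location may be relative, so resolve it against the current URL
                url = urljoin(url, resp.headers["Location"])
                chain.append(url)
            hops = len(chain) - 1
            if hops:
                print(f"{source}: {' -> '.join(chain)} "
                      f"({hops} hop(s); worth asking for a link update)")

if __name__ == "__main__":
    check_redirect_chains()
```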
Other Research Tools | effectdigital
Duplicate content due to parked domains
Oh, wow - if you're talking a couple of years ago and major ranking drops, then definitely get aggressive. Remove as many of the parked domains as possible and noindex the rest. If you've got the robots directives in place, Google shouldn't put them back (although, from past experience, I realize "shouldn't" isn't a guarantee). If you're down 90%, you've got very little to lose, and clearly Google didn't like something about that set-up.

Unfortunately, that's about the most drastic reasonable option. The next step would be to start over with a fresh domain and kill all of the old domains. That could be a lot more hazardous, though.
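As a rough way to keep tabs on the clean-up, here is a minimal Python sketch (assuming the requests library and a hypothetical parked_domains.txt list) that reports how each parked domain currently answers: redirecting, noindexed, or still serving indexable duplicate content:

```python
# Minimal sketch: report how each parked domain currently answers.
# "parked_domains.txt" (one bare domain per line) is a hypothetical
# input file; the meta-noindex check is deliberately crude.
import requests

def audit_parked_domains(path="parked_domains.txt"):
    with open(path) as f:
        domains = [line.strip() for line in f if line.strip()]
    for domain in domains:
        resp = requests.get(f"http://{domain}/", allow_redirects=False, timeout=10)
        header_tag = resp.headers.get("X-Robots-Tag", "").lower()
        if resp.status_code in (301, 302):
            print(f"{domain}: redirects to {resp.headers.get('Location')}")
        elif "noindex" in header_tag or "noindex" in resp.text.lower():
            print(f"{domain}: noindexed")
        else:
            print(f"{domain}: serving indexable content (duplicate-content risk)")

if __name__ == "__main__":
    audit_parked_domains()
```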
Intermediate & Advanced SEO | Dr-Pete
Is my site penalized by Google?
Thank you Gary for your effort and great insight. I didn't realize we were ranking in Google search sites other than Google.com, but it explains why WMT says we are getting some traffic from Google. Unfortunately, we have not shown up for our own domain for the term "Armia" for almost a year now, so it is not a regular recalculation. This is our main corporate site; most of the blog updates and social channels are updated through two different product sites. The social channels connected are from our own product site. Good point about the meta keywords; maybe it is time to retire them. I knew Google stopped taking them into account, but I didn't know they could have an adverse impact. Thanks again Gary. Aji
Intermediate & Advanced SEO | ajiabs
Is this site hit with Penguin or something else?
Okay, that makes sense. I will do a link-profile check on each of the parked domains and 302 redirect the selected names; the others will be pointed at some other location to collect the type-in visitors. Thank you.
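For what it's worth, here is a minimal sketch of that routing idea in Python with Flask (all domain names and destinations here are hypothetical placeholders): selected parked domains get a 302 to the main site, and the rest go to a holding page for type-in traffic:

```python
# Minimal sketch (Flask): 302 selected parked domains to the main site,
# send the rest to a holding page for type-in traffic. All domain names
# and destinations here are hypothetical placeholders.
from flask import Flask, redirect, request

app = Flask(__name__)

# Parked domains whose link profiles checked out cleanly.
SELECTED = {"parked-one.com", "parked-two.com"}
MAIN_SITE = "https://www.example.com/"
HOLDING_PAGE = "https://holding.example.com/"

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def route_parked(path):
    host = request.host.split(":")[0].lower().removeprefix("www.")
    destination = MAIN_SITE if host in SELECTED else HOLDING_PAGE
    # 302 (temporary) rather than 301, while the domains are evaluated.
    return redirect(destination, code=302)

if __name__ == "__main__":
    app.run()
```

Using a 302 keeps the arrangement reversible; any domain whose profile later proves clean and relevant can be promoted to a permanent 301.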
Intermediate & Advanced SEO | ajiabs