Bad Domain Links - Penguin? - Moz vs. Search Console Stats?
-
-
When submitting a disavow list, should I put a COMMENT on each one? Does someone at Google read it?
-
If I send 10 at a time, when I send the second batch, should I re-include the first batch with it? So basically just upload the same file over and over with new additions.
-
-
No one at Google reads these; it's all processed automatically now. The comments are for your own use when you look at the disavow list at a later date.
Yes. When you submit your next disavow file, include everything from before: say you disavowed 10 URLs the first time and want to disavow 10 more, your second disavow file should contain all 20 URLs, the 10 from last time plus the 10 new ones.
The best approach is to download your previous text file, add the URLs you want disavowed this time to that txt file, and add a dated comment to help you remember or to inform others.
Here is an article from 2014 from moz.com that helps explain this further - https://moz.com/blog/guide-to-googles-disavow-tool
Remember, you don't have to comment every entry, but at the very least add a comment each time you use the disavow feature; this will make things easier on you.
Just be careful: removing even spammy URLs could alter your domain authority, or possibly worse. Always be 100% sure you want the selected URLs to be ignored. The process can take as little as a month and as long as three months, with some people waiting a year for a URL to be disavowed. And if you make a mistake, that's about how long it can take to correct it.
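To illustrate the cumulative-file approach described above: a disavow file is just a plain UTF-8 .txt file where lines starting with # are comments and every other line is either a single URL or a domain: entry. The URLs and dates below are made-up examples, not real entries:

```text
# Batch 1 - submitted 2016-04-02 after first backlink audit
http://spammy-directory.example.com/links/page1.html
domain:bad-links.example.net

# Batch 2 - submitted 2016-05-10, 10 more found; batch 1 entries kept above
http://another-spam-site.example.org/widget.html
domain:more-spam.example.org
```

Each new upload replaces the previous file entirely, which is why the old entries have to stay in it.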
-
If those links aren't doing anything but redirecting, then yes, get rid of them; they aren't helping you and could hurt you if they haven't already.
If they were all from the same domain, I'd disavow that domain as well, but that doesn't seem to be the case, as another poster mentioned above.
You shouldn't need to disavow eventwire.com unless there is a backlink from them to your site that isn't helping anyone. But remember not to get carried away when using the disavow tool; just remember this nifty saying: **"When in doubt, leave it out. When it stinks, disavow the links."**
-
Thanks for the input!
What's your understanding of how sites recover from penalties now? I've seen a post saying Panda and Penguin are now 'baked into' the search algorithm, whereas in the past sites had to wait until Google rolled out a Panda/Penguin update to show any 'recovery'. So if it is 'baked in', and I submitted my disavow list a few hours ago, should I hopefully see immediate results now?
Thanks!
-
Panda is part of the Google core algorithm now; Penguin is not yet, but is expected soon (sometime between now and the end of this year).
So since Penguin isn't yet part of the core algorithm, the key is to submit your disavow list as soon as possible, since it can take anywhere from one to three months for Google to nofollow the links you sent. They start processing the URLs once you submit the file, but for reasons I have yet to pin down, it isn't as speedy as the Fetch as Google feature.
My understanding of how sites recover from penalties now is twofold.
If it's a manual penalty (such as getting a message from Google in the webmaster console), you fix what they mention and submit a reconsideration request.
If it's an algorithmic penalty, you scour the internet and read as much as you can until you go a little cross-eyed, then you read some more. You fix the issues that relate to your penalty (or what you suspect is your penalty), and then you wait. You tackle other aspects of your site's SEO until that specific algorithm is rerun or updated, and even then it's not 100% certain your site will recover right away; it might take a bit more time after that.
So once you do recover, stress to your client, your boss, or yourself that this cannot ever happen again, and never neglect your site's SEO again, or this issue will only be 100 times worse.
Hope this helps!
-
You are soooo right about the scouring the internet till I'm cross-eyed!
hahaha, I read this article: http://www.brafton.com/news/seo-1/penguin-to-join-googles-core-algorithm-what-you-need-to-know/ That 'expert' says Penguin has already been put into the core algorithm. I can't seem to find anything that "Google" has actually said; do you know?
Here's a quick rundown: my site, www.stephita.com, used to rank near the top of page 1 on google.ca for the search term "wedding invitations". Some time ago (I haven't been keeping track, but probably 1-2 years ago) it dropped to the middle of page 2, so I'm thinking my site fell about 10 spots. If I was hit with penalties because of Panda and Penguin, does a 10-place drop seem reasonable, or would it have been more? A few facts:
-
I never got a manual action
-
I suspect a Panda penalty because I had some 'black-hat SEO' pages, like a cloaking page, and also some relatively THIN content.
-
I suspect Penguin because I just did a backlink audit and felt that about 200 spammy domains were linking to me, amounting to about 900 URLs.
So with those 3 points, does a 10-place drop seem reasonable, or does my site just suck? hahaha
-
-
As for the article about Penguin being part of the core: I read a few articles back in January of this year saying that, but nothing came of it; instead there was an unnamed update and then the AdWords update. So far no one has declared with 100% certainty that Penguin is part of the core algorithm yet.
And from what you've mentioned, a 10-place drop doesn't seem like an algorithmic penalty. More likely some tactics that were working on your site stopped working, or other sites updated their content and now appear more relevant, causing your site to drop to page 2.
Let me give an example of what a Penguin penalty can look like. I have a keyword for a client that would not rank past page 4 and had settled on pages 6 and 7. After a recent site redesign where I made SEO the top priority, about two weeks after launch we saw that keyword rise from page 7 up to page 3. It stayed on page 3 for a day or two, then dropped back to page 6, then to page 7, stopping right back where it was. This is most likely a Penguin penalty, especially after seeing a not-so-awesome backlink profile with excessive anchor text for that keyword pointing to directories galore, as well as comment spam.
I think if you were hit by an algo penalty you'd be suffering ranking issues on a much more severe scale. I don't think your site sucks; it's just that you always have to keep your site up to date. Gone are the days of posting content and walking away. You need to create new content, promote it correctly, improve it, constantly check backlinks, check competitors, and stay on top of current trends in your industry. And changes you make today won't show results in the SERPs as fast as you'd like; sometimes it takes time.
-
Oh... I was hoping it was a 'penalty', because at least then there would be hope that fixing those issues would make Google smile at my site again.
I basically started this 'clean-up' process three months ago, using the Moz tools to identify "high/medium priorities". I've put in the effort to remove all "duplicate pages", which were marked HIGH, and I've adjusted the META TITLES, which were marked MEDIUM. According to the Moz crawl report, I'm down to a few issues versus the 900 or so when I first ran the report a few months ago.
As for the Penguin fixes, I only discovered this 3 days ago and submitted my first DISAVOW list yesterday, which contained 900+ URLs across about 210 domains.
If the 10-place drop wasn't a penalty, I can only hope this 'clean-up' attempt will help? But the big question is the timeframe...
Is it possible Google assessed my site only a MINOR penalty? Or is it either nothing or SEVERE?
-
Seems something is going on with your internal link structure: many, many weird URLs with duplicate subpages.
Seems it's a .js subpage loop going on; see http://wordpress.stackexchange.com/questions/93844/child-pages-loop
So that handles the huge page-total issue. Get this fixed, and maybe peek into other pages to see how they are worded and structured; make sure every page has its h1 tag, but only one h1, with h2-h5 for anything else if needed. For a 10-position move, I wouldn't imagine there's a huge number of issues going on; it's really a matter of finding them.
End Update
Curious: I went to run Screaming Frog on your site, and it's saying it's only 30% done crawling at 8,549 pages, with another 33,359 pages and counting still to go!
Do you really have this many pages?
If so, do you need them all?
If you have 30k+ pages, odds are some of them are bad pages. Having this many pages makes it very hard to do good SEO unless you've been doing good SEO since day one.
I'd say it's not a penalty, since you only went from page 1 to page 2; that happens even on good days and is often a sign of competitors one-upping you somehow.
If you can, I'd suggest pruning a good portion of these pages unless you have to have them, like product pages and such.
At the time of writing this, it's still totaling your pages: it's up to 53,932 now and still counting, and Screaming Frog has gone from 31% remaining to 29%.
-
Thanks for helping out with the site audit! I truly appreciate it.
How did you detect the loop in the link structure?
Regarding the page count, I thought the current count was < 1,000 based on the Moz Site Crawl tool.
Prior to January of this year (2016), I had a cloaking page running that basically generated pages with different city references, so my site came up for searches like "wedding invitations Banff", "W I Timbuktuu", "W I any city". Obviously this was a black-hat tactic, which I figured out a few months ago is a no-no under the Panda update. So I removed the link from my link structure, and I also requested that the directory be removed via Google Webmaster Tools. I just realized I didn't take that script offline or have it redirect; i.e. www.stephita.com/wedding/type_anything_here is still live, but there is no longer any link from my site that goes there. I'm thinking I should probably do a 301 redirect to my homepage instead now.
So I'm curious how Screaming Frog is seeing these pages, if it is indeed these pages: www.stephita.com/wedding/____________________
Thanks!
-
I ran your site URL through Screaming Frog; it found more pages than I expected at 30% and was still counting, so I took a quick glance at the URLs it was returning and found hundreds containing the same words like "vendor", "foundation", and "js" over and over again. Then I just did a search for ".js subpage looping".
About Moz's page count: this is what threw me off. After I saw the ungodly number of URLs, I did a site:yourdomain.com search in Google to see how many pages were being indexed, and it returned a normal amount, somewhere around 1,240 pages.
The cloaking, or anything black-hat, is a big deal to get rid of as soon as possible; even the leftover script could be seen as an issue. But in this case it's also causing extra pages to be seen, so removing that script will help clear up a great deal of Google's confusion. If Screaming Frog was having issues, so was Google, and that could even be why you dropped in ranking.
Screaming Frog saw these pages as various different ones:
All from the article subpage, so I guess that's the home of that script, or at least where it starts making new pages.
-
I don't know how that .js subpage is looping in the address bar; I'm just using the foundation.zurb.com code to make my site mobile-friendly.
I do have an idea why the articles page might be spinning out of control: it's a dynamic page that pulls my article content, and it's dependent on the URL, so technically I could have an infinite number of pages spawning. I will correct the script to catch this error and have it redirect to a base page.
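The fix described above can be sketched in a few lines. This is a hypothetical illustration, not the site's actual code: the function name, slug names, and base-page path are all assumptions. The idea is to only serve article slugs that actually exist and send a permanent 301 to one canonical articles page for everything else, so arbitrary URLs can't spawn infinite new pages:

```python
# Hypothetical sketch: whitelist real article slugs; 301 everything
# else to a single base page so crawlers stop discovering endless
# URL variations.

KNOWN_ARTICLES = {"choosing-paper", "invitation-wording"}  # assumed slugs
BASE_PAGE = "/articles/"

def route_article(slug):
    """Return (status, target) for a requested article URL."""
    if slug in KNOWN_ARTICLES:
        return (200, slug)        # real article: render it
    return (301, BASE_PAGE)       # unknown slug: permanent redirect

print(route_article("choosing-paper"))  # (200, 'choosing-paper')
print(route_article("type_anything"))   # (301, '/articles/')
```

However the real page is implemented, the key design point is the same: unknown slugs must never return a unique 200 page of their own.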
As for the previous cloaking pages, I'm assuming a 301 redirect to the homepage would be better than DELETING the page entirely, which would return a 404.
-
It's better to 301 a page if it's relevant than to let it go 404. However, if it's a service or product you no longer carry, you might be better off keeping it a 404 so the page goes away, unless you think you'll use that URL again.
404s won't hurt you unless you have a huge number of them, but if you have a page you no longer want and wish to be forgotten, you can let it 404 and it will eventually go bye-bye.
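For reference, a sketch of what the 301 for the old cloaking directory could look like, assuming the site runs on Apache (that's an assumption; the directive differs on other servers). One RedirectMatch line in .htaccess sends everything under /wedding/ to the homepage:

```text
# Assumes Apache with mod_alias; adjust for your actual server setup.
# Permanently redirect every URL under the old cloaking directory
# (e.g. /wedding/type_anything_here) to the homepage.
RedirectMatch 301 ^/wedding/.* http://www.stephita.com/
```

A targeted redirect like this covers the leftover script's URLs even after the script itself is taken offline.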