Posts made by Deacyde
-
RE: Turn grey myself or rat on black hat competitors?
-
RE: Regarding FB advertisment
While I have nothing to add regarding Facebook ad clicks not showing up in Analytics, I do want to add a note of caution: while Facebook produces a lot of clicks, impressions, likes, and even website clicks, unless your site is broad enough to use the flypaper traffic tactic and lure some of those people in, I've found Facebook's user base to be largely impulse-driven rather than focus-driven.
For broad niches this works well, for narrow niches it's a risk.
-
Would you consider these domains spammy or at risk for penguin penalty?
I'm doing a bit of competitor backlink auditing and am curious how you view these domains.
I'm also doing this as a comparison for others, and perhaps as a growing reference list of domains they can cross-check against their own backlink profiles to make sure these aren't in there, in preparation for Penguin's release to the core algorithm.
Feel free to chime in with domains known to be spam, and I will keep a live list based on the answers here.
Domain Domain Rating
directoryworld.net 64
prlog.ru 60
torchbrowser.com 60
jzip.com 58
localbiznetwork.com 58
gimpsy.com 54
pageonepower.com 52
nashvillelife.com 49
2thetopdesign.com 49
seocounsel.com 48
healthywealthyaffiliate.com 48
hollinslegal.com 48
On a working disavow list, either in root-domain or extended-URL form:
seocourt.com
wldirectory.com
onpaco.com
lostdirectory.com
ligginit.com
alivedir.net
tiptopdirectory.com
onemilliondirectory.com
submittolinkdirectory.com
I've read that sitewide footer links, even from parent-company sites, will only count as one link regardless of how many there are. I was wondering whether that is true as well?
-
RE: Google Local Result- Competitor Ranks without an Address. Why is this happening?
Thanks, yeah, I'm pretty sure national SEO just means working on organic, but it never hurts to ask and be 100% sure.
I've already asked one way a while back, but it really boils down to this: since we don't have physical locations, and those drop-shipping places are just affiliate services we can access, we don't warrant the same local search options as, say, a Burger King or Walmart with their national physical listings.
Thanks anyway.
-
RE: Google Local Result- Competitor Ranks without an Address. Why is this happening?
While not related to this exact issue, I did have a question about service-area businesses (SABs) in general.
If you were a nationwide distributor able to drop-ship from various locations across the nation, would you use an SAB listing and include the whole nation, or is there no feature for that sort of situation yet?
Local SEO for a nationwide service doesn't seem to be a thing yet, or maybe I'm missing something.
-
RE: How long does Google take to re-index title tags?
I can also add that Google might simply insert its own title tag for your page and completely disregard the one you wrote. Why? Because they feel they know better. In my experience they don't, and it makes no sense to me why Google would basically slap your hand and say, "No, bad, this is what we're using," when it's your site to begin with.
But it is Google, and at the end of the day it's their search engine = their rules.
So all that can be said is: make sure your title tag is within the character limits and relevant, get it indexed, and wait. It's kind of like rich snippets and search boxes for brand-name searches; Google chooses when those show up as well. It's really odd how SEOs, SEMs, and so forth squeeze themselves through Google's molds, yet Google lets itself hang everywhere without a care.
But them's the breaks!
Just understand this: ( Google > Us )
-
I understand it's been asked before; however, Moz staff is telling me a keyword's capitalization is treated as a separate keyword.
So there I was, looking through my rankings like any other day, when I saw a lowercase and an uppercase version of the same keyword. Most times I see this they have the same rankings; I even researched this about five months ago and came to the conclusion that Google treats them the same way.
This time, however, they had different ranks: the same keyword, with only the first letter of the two-word phrase capitalized. I asked Moz staff about this, as I felt it was an error, but was met with the answer that Google does indeed treat these keywords differently. My line of thought was that the rank checker didn't check the lowercase and uppercase keywords at the same time, and the SERPs happened to change between checks, returning a different rank.
So now I am in doubt again: are uppercase and lowercase keywords different or the same in Google's eyes?
I honestly don't know why an uppercase keyword would signal different searcher intent than a lowercase one, when more often than not searchers can't even spell the keyword correctly.
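This kind of double-counting is easy to spot in a tracker export. Here's a small Python sketch, using a made-up list of (keyword, rank) pairs, that groups tracked keywords case-insensitively so capitalization variants of the same query end up together:

```python
from collections import defaultdict

# Hypothetical rank-tracker export: the same query tracked twice with
# different capitalization, each with its own reported rank.
tracked = [
    ("seo tools", 12),
    ("SEO Tools", 14),
    ("link building", 7),
]

# Google treats these as the same query, so group by the casefolded form;
# differing ranks usually just mean the checks ran at different times.
grouped = defaultdict(list)
for keyword, rank in tracked:
    grouped[keyword.casefold()].append(rank)

for keyword, ranks in sorted(grouped.items()):
    print(keyword, ranks)
```

If a "different" keyword collapses into an existing group here, it's the same query checked twice, not two distinct search intents.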
-
RE: Huge amount of backlinks detected - what to do ?
As Eric mentioned above, you really should be doing backlink audits every month or two just to keep your site healthy and free from possible penalties.
Using a spreadsheet, enter all backlink URLs and fill in whatever other data you can find on them: DA, PA, the linking site's own backlink profile, whether it's a directory or comment spam, whether it's even relevant to your site, whether it uses many exact-match keyword anchor texts, and especially whether the backlink is still active or returning a PHP error, and so forth.
Fill out the spreadsheet to the best of your ability, then go through and flag the sites that are considered no good.
Be careful with the disavow tool: Google starts working on the file as soon as you upload it, though if you make a mistake you can re-upload the disavow file with updated changes. Keep a copy of every disavow file you send to Google, since it will help you very much in the future.
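For reference, the disavow file itself is a plain UTF-8 text file with one entry per line: `domain:` covers a whole domain, a bare URL covers a single page, and lines starting with `#` are comments. The domains below are made-up examples:

```text
# Directory spam found in the March audit
domain:spammy-directory.example
# A single comment-spam page, disavowed by URL
http://blog.example/some-post#comment-456
```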
-
RE: Tool to identify duplicated content on other sites
One great way to do this is to take a sentence or two and search it in Google surrounded by double quotes, like:
"A sentence I want to Google search to see if other sites come up using these sentences"
This is one of the best ways to find external duplicate content.
You can always use this - http://www.seoreviewtools.com/duplicate-content-checker/ - to double-check Google's results, as well as check within your own site for duplicate content.
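If you'd rather compare two suspect passages directly instead of eyeballing search results, a quick sketch using Python's standard difflib can score their similarity; the excerpts here are made up:

```python
import difflib

# Two hypothetical page excerpts to compare for near-duplicate content.
original = "Our handmade widgets are built from recycled steel in Nashville."
suspect = "Our handmade widgets are built from recycled steel in Portland."

# SequenceMatcher returns a 0.0-1.0 similarity ratio; a high ratio on
# longer passages is a strong hint that one page copied the other.
ratio = difflib.SequenceMatcher(None, original, suspect).ratio()
print(f"similarity: {ratio:.2f}")
```

A ratio near 1.0 on full paragraphs is worth investigating; short sentences will naturally score high by chance.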
Hope this helps.
-
RE: Backlinks on sites that aren't relevant, how does google determine relevancy?
Thank you both for the replies.
I try to report them as I find them, but I will be even more vigilant about it now.
As for Rebecca's questions: it's an algorithmic bad-ranking penalty. I still call it a penalty since it's penalizing us, and the reason I think it's Penguin is that the site in question has traffic patterns that best fit a site suffering from Panda and Penguin issues. Panda is part of the core algorithm now, and the site hasn't recovered: a single keyword and its synonyms won't rank better than page 5. Penguin is the only algorithm left that corresponds to the traffic being lost as far back as April 2012. The other reason I say it's Penguin is that the client's backlink profile is horrid; over 20% of follow links use that keyword or a synonym as anchor text from crap sites. If that's not the trigger for the Penguin issue, then I'm getting results from doing the wrong things.
Both our redesign and link cleanup have directly increased keyword rankings, but no increase gets past page 5. I've seen it on page 4 before, but only for a day, nothing longer. So it seems able to rank higher but gets kicked back almost right away, faster than it rose. It really seems like it's being "adjusted" after going past page 5.
-
RE: How to use long tail keyword?
It helps a lot, the same way buying smaller items at the grocery store can add up big and fast.
Say you have a page set up for the keyword "SEO", and branching off of it you have various other pages using long tails, like one handling "SEO basics for beginners" with a guide using that keyword and its synonyms, plus a few more branching out from the SEO keyword page.
You could use the traffic from the long-tail keywords to funnel into the SEO keyword page, helping you rank better for the keyword "SEO" (as long as the content is appealing to users).
It's a strategic mindset that wins with long tails, as opposed to a brute-force approach.
As well, "SEO" at 5,000 searches per month is much higher competition than "SEO basics for beginners" at 40, meaning you have a better chance of ranking high first with the long-tail phrase and funneling that traffic into the harder keyword. As long as you get good metrics from the traffic to that SEO page, it will increase your overall ranking for "SEO" as well.
-
Backlinks on sites that aren't relevant, how does google determine relevancy?
I have a competitor I've talked about on here before: they passed us in rankings for our main keywords even though they were a new site (less than 6 months old).
Well, after further digging, a few things were found out:
- Our site seems to have a Penguin penalty keeping our keywords (a group of synonyms as well) from ranking better than position 30.
- Our competitor has been using a weird backlink tactic: SEO sites (a friend?), a parent site to prop it up (footer links and others), links inside articles that aren't relevant (such as a commercial-niche page linked from a lawyer's site in a page about phone dialing), and then the low-DA directory sites.
I'm curious whether the Penguin algorithm will catch them once it comes out. If not Penguin, a manual review would certainly catch their tactics, but this site is doing everything except proper SEO.
What do you think?
-
RE: Bad Domain Links - Penguin? - Moz vs. Search Console Stats?
It's better to 301 a page to a relevant target than to let it 404. However, if it's a service or product you no longer carry, you might be better off leaving it a 404 so the page goes away, unless you think you'll use that URL again.
404s won't hurt you unless you have a huge number of them, but if there's a page you no longer want and wish to be forgotten, you can let it 404 and it will eventually go bye-bye.
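As a rough illustration of the 301 option, here's what the redirect might look like in an Apache .htaccess file; the paths and error page are made-up examples:

```apache
# Permanently redirect a retired product page to its closest replacement
Redirect 301 /old-product.html /new-product.html

# A discontinued page with no replacement can simply be left to return 404;
# at most, point Apache at a friendly custom error page
ErrorDocument 404 /404.html
```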
-
RE: Bad Domain Links - Penguin? - Moz vs. Search Console Stats?
I ran your site URL through Screaming Frog. It found more pages than I expected; at 30% it was still counting, so I took a quick glance at the URLs it was returning and found hundreds containing the same words, like "vendor", "foundation", and "js", over and over again. Then I just did a search for ".js subpage looping".
About Moz reporting your page count: this is what threw me off, since after I saw the ungodly high number of URLs, I did a site:yourdomain.com search in Google to see how many pages were indexed and got back a normal amount, somewhere around 1,240 pages.
Cloaking or anything black hat is a big deal to get rid of as soon as possible; even the leftover script could be seen as an issue. In this case it's also causing extra pages to be seen, so removing that script will clear up a great deal of Google's confusion. If Screaming Frog was having issues, so was Google, and that could even be why you dropped in the rankings.
Screaming Frog saw these pages as various different ones:
All from the article subpage, so I guess that's the home of that script, or at least where it starts making new pages.
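A loop like this is easy to flag programmatically. Here's a small Python sketch, with a made-up crawl list and domain, that marks URLs whose path repeats a segment, which is the pattern Screaming Frog was surfacing:

```python
from urllib.parse import urlparse

# Hypothetical crawl export containing looping URLs like the ones above
# (repeated "vendor"/"foundation"/"js" path segments).
crawled = [
    "http://example.com/article/",
    "http://example.com/article/vendor/foundation/js/",
    "http://example.com/article/vendor/foundation/js/vendor/foundation/js/",
]

def looks_like_loop(url):
    """Flag URLs whose path contains the same segment more than once."""
    segments = [s for s in urlparse(url).path.split("/") if s]
    return len(segments) != len(set(segments))

for url in crawled:
    print(url, looks_like_loop(url))
```

Legitimate URLs can occasionally repeat a segment, so treat the flagged list as candidates to review rather than pages to delete outright.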
-
RE: Bad Domain Links - Penguin? - Moz vs. Search Console Stats?
Update: Something seems to be going on with your internal link structure; there are many, many weird URLs with duplicate subpages.
It looks like a .js subpage loop: http://wordpress.stackexchange.com/questions/93844/child-pages-loop
That handles the huge page-total issue. Get it fixed, then maybe peek into your other pages and see how they are worded and structured; make sure every page has its h1 tag, but only one h1, with h2-h5 for anything else if needed. For a 10-position move, I wouldn't imagine there's a huge number of issues going on; it's really a matter of finding them.
End Update
Curious: when I went to run Screaming Frog on your site, it said it was only 30% done crawling at 8,549 pages, then showed me 33,359 pages still to go, and counting!
Do you really have this many pages?
If so, do you need them all?
If you have 30k+ pages, odds are some of them are bad. Having this many pages makes it very hard to do good SEO unless you've been doing good SEO since day one.
I'd say it's not a penalty, since you only went from page 1 to page 2; that happens even on good days and is often a sign of competitors one-upping you somehow.
If you can, I'd suggest pruning a good portion of these pages, unless you have to keep them because they are product pages or such.
At the time of writing this, it's still totaling your pages; the count is up to 53,932 and climbing, and Screaming Frog has gone from 31% remaining to 29%.
-
RE: Bad Domain Links - Penguin? - Moz vs. Search Console Stats?
As for the article about Penguin being part of the core: I read a few articles back in January of this year about that, but nothing came of it; instead there was an unnamed update and then the AdWords update. So far no one has declared 100% that Penguin was part of the core update.
And from what you've mentioned, a 10-position drop doesn't seem like an algorithmic penalty; more likely some tactics that were working on your site stopped working, or other sites updated their content and appear more relevant, causing your site to drop to page 2.
Let me give an example of what a Penguin penalty could look like. I have a keyword for a client that would not rank past page 4, and it had settled on pages 6 and 7. After a recent site redesign where I made SEO the top priority, about two weeks after launch we saw that keyword rise from page 7 up to page 3. It stayed on page 3 for a day or two, then dropped back to page 6, then to page 7, stopping right back where it was. This is most likely a Penguin penalty, especially after seeing a not-so-awesome backlink profile with excessive anchor text for that keyword pointing to directories galore, as well as comment spam.
I think if you were hit by an algorithmic penalty, you'd be suffering ranking issues on a much more severe scale. I don't think your site sucks; it's just that you always have to keep your site up to date. Gone are the days of posting content and walking away. You need to create new content, promote it correctly, improve it, constantly check backlinks, check competitors, and stay on top of current trends in your industry. And changes you make today won't show results in the SERPs as fast as you'd like; sometimes it can take time.
-
RE: Bad Domain Links - Penguin? - Moz vs. Search Console Stats?
Panda is part of Google's core algorithm now; Penguin is not yet, but is expected soon (sometime between now and the end of this year).
So, since Penguin isn't yet part of the core algorithm, the key is to submit your disavow list as soon as possible, since it can take anywhere from one to three months for Google to treat the links you sent as nofollowed. They start processing the URLs once you submit the file, but for reasons I have yet to work out, they aren't as speedy with this process as they are with the Google Fetch feature.
My understanding of how sites recover from penalties now is twofold.
If it's a manual penalty (i.e., you get a message from Google in Search Console), you fix what they mention and submit a reconsideration request.
If it's an algorithmic penalty, you scour the internet and read as much as you can until you go a little cross-eyed, then you read some more; you fix the issues that relate to your penalty (or what you suspect is your penalty), and then you wait. You tackle other aspects of your site's SEO and wait until that specific algorithm is rerun or updated, and even then it's not 100% certain your site will recover right away; it might take a bit more time after that.
So once you do recover, stress to your client, your boss, or yourself that this can never happen again, and never, ever neglect your site's SEO again, or this issue will only be 100 times worse.
Hope this helps!
-
RE: How to tell when a directory backlink or other backlink is worthy of the disavow tool? Especially when a keyword is not ranking past where it should.
Hello Robert,
Hope you're doing well,
It seems I just didn't include all the info I needed; making posts as if you all have access to the data in my head is a mistake.

What I meant is that I have 500 URLs. After doing a full link audit, 153 of those URLs are trashy directory or comment-spam backlinks on very spammy site templates, and more often than not they had anchor text using our main keywords.
Now, yes, 153 out of 500 URLs isn't that bad. However, the 153 URLs are more than three years old, and our total backlinks have grown from 160 (a year ago) to 500 (today), meaning that at one point the majority of our backlinks were from trashy directories and comment spam. In fact, a year ago 160 minus those 153 left us with just 7 possibly okay backlinks, so it's very likely these URLs were the reason we got hit by Penguin (I have data that suggests this as well).
It's easy to get fixated on one SEO key point or another, but when you do, just remember to follow the data trail and watch for the other SEO footprints along the way; you can find a good bit.
Even answering questions from others here helps me find further key points to highlight and work through.
-
RE: Bad Domain Links - Penguin? - Moz vs. Search Console Stats?
If those links aren't doing anything but redirecting, then yes, get rid of them; they aren't helping you, and they could hurt you if they haven't already.
If they were all from the same domain I'd disavow that domain as well, but that doesn't seem to be the case, as another poster mentioned above.
You shouldn't need to disavow eventwire.com unless there is a backlink from them to your site, and it's not helping anyone. But remember not to get carried away when using the disavow tool; just remember this nifty saying: "When in doubt, leave it out; when it stinks, disavow the links."
-
RE: Bad Domain Links - Penguin? - Moz vs. Search Console Stats?
No one at Google reads these; it's all handled automatically now. The comments are for your own use when you look at the disavow list at a later date.
Yes, your next disavow file should be cumulative: say you disavowed 10 URLs the first time and want to disavow 10 more; your second disavow file will have 20 URLs in it, 10 from last time and 10 from this time.
The best course is to download the text file from before, add the URLs you want disavowed this time, and include a comment about this round to help you remember or to inform others.
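That download-and-append step can be sketched in a few lines of Python; the domains, dates, and comments here are made up for illustration:

```python
# Merge the previous disavow file's lines with this round's new entries,
# keeping the list cumulative and de-duplicated.
def merge_disavow(previous_lines, new_lines):
    seen, merged = set(), []
    for line in previous_lines + new_lines:
        stripped = line.strip()
        # Keep comments and blank lines as-is; dedupe actual entries.
        if stripped.startswith("#") or not stripped:
            merged.append(line)
        elif stripped not in seen:
            seen.add(stripped)
            merged.append(stripped)
    return merged

previous = ["# uploaded 2016-01-10", "domain:spamdir.example"]
new = ["# added 2016-04-02", "domain:spamdir.example",
       "http://badlinks.example/page1"]
print("\n".join(merge_disavow(previous, new)))
```

The output is the full cumulative list you'd upload, with each round's dated comment preserved so you can see later when each entry was added.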
Here is a 2014 article from moz.com that explains this further - https://moz.com/blog/guide-to-googles-disavow-tool
Remember, you don't have to comment every line, but at the very least add a comment each time you use the disavow feature; it will make things easier on you.
Just be careful: removing even spammy URLs could alter your domain authority, or possibly worse. Always be 100% sure you want the selected URLs to be ignored; this process can take as little as a month and as long as three months, with some people taking a year to have a URL disavowed. And if you make a mistake, that's about how long it can take to correct it.