Questions
-
Is it possible to track rank history
Hi Anthony, Unless you find a campaign or individual who has already been tracking results, either for a specific keyword or for a specific domain, you can't retroactively pull this data through tools like Moz. You can, however, use Searchmetrics' domain visibility tool to see how dominant a site has been in its niche over time, which gives you a rough sense of whether it has ranked well or poorly for its keywords: http://suite.searchmetrics.com/en/research/domains/visibility My former agency has an in-house process like this that we used daily (I wrote about how it works in a post, although it has since been developed into a tool and is more sophisticated now). It's useful when you want to look at historic performance but don't have specific ranking reports, and Searchmetrics is a great public alternative. I hope this helps! Jane
Moz Tools | | JaneCopland0 -
Landing pages rank higher than home page
Hi Anthony, Kurt's answer was pretty dead on (thanks Kurt!). Just want to add some more detail about the disavow specifically. John Mueller (he works on the Webmaster Tools team) has told us in some of his Google Hangouts that a disavow can take up to six months to fully take effect. This is because Google has to recrawl all of the domains/pages in the disavow file on its own crawl schedule (which has nothing to do with when you submit the file). As the disavowed URLs are recrawled, Google essentially applies a nofollow to those links. Sounds like you're doing the right things! In the meantime, I would also work on getting traffic from other sources, even paid; I have heard John suggest that you can start testing page quality with PPC traffic. I also believe evidence of other traffic sources, especially social, will help any site recover much more quickly.
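For reference, the disavow file Google expects is a plain text file with one entry per line; a domain: prefix covers every link from that site, while a bare URL disavows a single page. A minimal sketch (the domains below are placeholders, not real sites):

```text
# Links from these sites were part of a paid link scheme
domain:spammy-directory-example.com
domain:link-network-example.net
# A single page can also be disavowed by full URL
http://blog-example.org/low-quality-post.html
```

Since Google only processes entries as it recrawls them, this is also why the full effect takes months rather than days.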
Link Building | | evolvingSEO0 -
Boatload of 301 Redirects Question
If the pages have traffic, rankings, etc., then it's better to set up relevant one-to-one redirects instead of letting them 404. It's always better to 301 redirect directly to the relevant page instead of having a chain of redirects pointing to other redirects. Having 75 new redirects won't slow down Googlebot; it can handle quite a lot.
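As a sketch, on an Apache server a batch of one-to-one 301s can live in .htaccess using mod_alias (the paths here are made up for illustration); each old URL points straight at its final destination, never at another redirect:

```apache
# One-to-one 301 redirects: each old URL goes straight to its final page,
# never to another redirect in a chain
Redirect 301 /old-widgets.html http://www.example.com/widgets/
Redirect 301 /old-about-us.html http://www.example.com/about/
Redirect 301 /2011/summer-sale.html http://www.example.com/sales/
```

Seventy-five lines like this is trivial for both Apache and Googlebot to handle.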
On-Page / Site Optimization | | MikeRoberts0 -
Should sub domain blog have www or non www
I believe it really comes down to personal preference. When you see sub.domain.com in the wild, it is usually the non-www version, but you are not going to be penalized or anything for using the www version. The non-www version also saves the user a few keystrokes (sub.domain.com vs. www.sub.domain.com), though four extra characters is hardly a big deal. Whichever you choose, pick one and stick with it. Mike
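Whichever version you settle on, it's worth 301ing the other to it so links to both forms consolidate on one URL. A minimal Apache sketch (the domain names are hypothetical) that forces the non-www subdomain:

```apache
RewriteEngine On
# Send www.blog.example.com requests to blog.example.com with a 301
RewriteCond %{HTTP_HOST} ^www\.blog\.example\.com$ [NC]
RewriteRule ^(.*)$ http://blog.example.com/$1 [R=301,L]
```

Swap the condition and target to force the www version instead.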
On-Page / Site Optimization | | Mike.Goracke0 -
Month-old site already ranks #3 for competitive keyword
Sucks, don't it? Google isn't perfect, and there are still many flaws that let sites like this rise to the top. I've seen this happen many times (especially in the more competitive industries). This is the difference between black hat and white hat SEO. White hat route: keep providing good material and getting higher-quality backlinks, and eventually you'll overtake him (and the results will be more permanent). Fast & risky route: it looks like he just did a big social bookmarking blast. You can find these services around the web and order them for yourself (get a link report first and see if the sites match up). You'd be on even ground, but the downside is the risk of going under in the next update. Best of luck, Oleg
White Hat / Black Hat SEO | | OlegKorneitchouk0 -
Does Google have problem crawling ssl sites?
Thanks for the replies. I think we have the HTTP issue fixed and will work on the footer area next. Thanks again for the heads up.
Web Design | | anthonytjm0 -
Link directories question
Do you know what, I knew that was going to be your response! The number of times I have heard of "reputable" SEO companies doing the same old things is truly horrifying. If they are like the others I have heard of, they have probably made it quite easy for you to disavow the links by adding approximately 20+ links across various pages of each domain. In that case, when and if you disavow, just disavow the whole domain and that takes care of all of them at once. That is, of course, if you are left with no choice but to disavow. Best of luck Anthony! Matt
White Hat / Black Hat SEO | | Horizon0 -
Copyscape Duplicate Content Ownership Question
Google is pretty good at identifying the original source of content; however, it isn't perfect. I suggest pursuing options to have the content removed from the scraper/spam sites whenever possible.
On-Page / Site Optimization | | edwardrj0 -
Are templates considered duplicate content?
I think if you have very thin content and the template makes up the bulk of each page, say 80% source code and 20% unique content, Google may not look favorably on that. Google has algorithms that rank cookie-cutter sites below custom sites, so it is important to keep your code somewhat unique. That said, having only two sites share the same template probably is not "the" problem; those algorithms mainly target people who buy the $500 money-making sites that look identical to 200 others except for the content.
Branding / Brand Awareness | | goodlegaladvice0 -
Where to find quality bloggers with seo background for sub contract work
Unfortunately there are a lot of, shall we say, 'lesser' quality candidates on those sites, and you really have to wade through some poor candidates before finding decent writers. My advice would be to actively search those sites for contractors you would like to use rather than posting a job offer; with an advanced search you should be able to find a quality candidate. You should also be aware that quality candidates with a very good level of English are likely to charge far more than candidates whose first language is not English. You could try MyBlogGuest (I use the site quite often to find guest posting opportunities), but you may run into a similar problem there with candidates who have a lower standard of English. It really depends on whether you require volume or quality. You could potentially have both, but that will be very, very expensive. Whereas a candidate with a lower level of English may charge around $8/hour, you may find that a writer with a high standard of English charges from $20/hour upwards. If you need a large number of articles on a limited budget, you might have to compromise on the quality somewhat.
Content & Blogging | | Adam.Whittles0 -
What do you use for site audit
I use the following tools:
Xenu - identifies broken links
GSite Enterprise Crawler - identifies on-page issues
Google Cache, Google Webmaster Tools - finds crawling issues
Scritch - identifies server/platform type
Ahrefs, Majestic, OSE - link diagnostics
SEO Book Bulk Server Header Tool - checks HTTP response headers in bulk
Moz Tools | | CatalystSEM0 -
Article on site and distribution, is it duplicate content?
To be honest, I don't see the point of doing that. I would keep the best version of the article on my own site, rewrite it (not spin it), and publish the simpler rewritten version elsewhere. Even the rewritten version shouldn't be used more than a couple of times. I think the best practice is to have the distributed articles link back to your on-site article as the source of information, but don't do too many of these. Copied content of any type is not healthy today.
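If a site does republish your article, one safeguard worth asking for is a cross-domain canonical tag in the head of their copy, which tells Google which version is the original (the URL below is a placeholder):

```html
<!-- Placed in the <head> of the republished copy, pointing at the original -->
<link rel="canonical" href="http://www.example.com/original-article/" />
```

Not every publisher will add it, but when they do it removes most of the duplicate-content risk from syndication.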
On-Page / Site Optimization | | BeytzNet0 -
Duplicate content home page and custom page question
If the home page is ranking for "custom colored marbles," then I wouldn't change it and try to move that keyword target to a different page. Your home page is your "custom colored marble" page. Double down on your efforts to optimize the home page for that phrase, and then work on other phrases on other pages. Trying to switch which page ranks for a term when that page is already in the top 10 is most definitely a risky move I couldn't recommend.
On-Page / Site Optimization | | AdoptionHelp0 -
Waiting 3 days for Crawl Test to complete
Hey everyone! Sorry about the odd crawl test activity. If you don't mind shooting us an email at help@seomoz.org with your PRO email address and the name of the domain that is stuck, we can push them through on the back end. I would grab more information here, but we can't since it is a public forum. :S Looking forward to your emails, and sorry about the wait!
On-Page / Site Optimization | | Nick_Sayers0 -
Which is better, a directory 301 redirect or each page in the directory?
Hi Anthony, First of all, it is always better to redirect URLs to individual pages than to perform a sitewide 301 redirect. But in this case, if the individual pages aren't getting much traffic, it may not make much difference. If the articles are truly low quality and you are worried about a future penalty, you may want to simply remove them without a redirect at all. Serve a 410 HTTP response (Gone) instead, and carefully watch your traffic/rankings to make sure nothing drops. It's most likely Google is simply ignoring these pages. The best defense is to build up an offense of quality material so the bad doesn't outweigh the good. Hope this helps. Best of luck with your SEO.
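For what it's worth, a 410 for removed articles can be served straight from an Apache .htaccess file with mod_alias's gone keyword, no target URL required (the paths below are hypothetical):

```apache
# Serve 410 Gone for removed low-quality articles
Redirect gone /articles/thin-article-1.html
Redirect gone /articles/thin-article-2.html
```

Unlike a 404, the 410 explicitly tells crawlers the pages were removed on purpose, which tends to get them dropped from the index faster.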
On-Page / Site Optimization | | Cyrus-Shepard0 -
Domain.com and domain.com/index.html duplicate content in reports even with rewrite on
Hello Anthony, Saw this is still open. If your index.html rewrite code is accurate, could the issue be WWW, i.e. http://www.domain.com vs. http://domain.com? The rule below forces the www version; note the escaped dot, the anchors, and the [NC] flag: RewriteCond %{HTTP_HOST} ^domain\.com$ [NC] RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
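For comparison, a common index.html-to-root rewrite (the piece assumed to be in place here; a sketch only, using the same domain.com placeholder as the snippet above) checks THE_REQUEST so that internal rewrites don't loop:

```apache
RewriteEngine On
# 301 any direct browser request for /index.html to the root URL
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\ /index\.html [NC]
RewriteRule ^index\.html$ http://www.domain.com/ [R=301,L]
```

With both rules in place, domain.com, www.domain.com/index.html, and domain.com/index.html all resolve to the single www root URL, which should clear the duplicate pair from the reports.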
Moz Tools | | SeanKoenig0