Probably not wise to hide the business address; it might impact how the business ranks from a local point of view.
The only reason I can see for someone wanting to hide an address is if they wanted to shut down or move a location.
If you are using niche-specific directories I think they will be fine.
The directories you need to stay away from are sites like freeSEOdirectory.info.
Do your research: see if the site is active, whether it has quality content, and whether there is an approval process for new businesses being added.
It does sound like the site was legacy and the update took too long to happen. We have seen the same thing with other business owners on legacy CMSes who have not migrated quickly enough, so it is a common issue.
I guess the things you need to think about are the following:
1. Were all the disavowed links pure spam? Most of the profile looks branded, and we have seen SEOs disavow quality links in the past, which can have an adverse impact on the domain (see attached). I presume the disavow was mostly legacy spam.
2. The domain may have been hit by Panda in the past (see attached), though without seeing internal data it is hard to give a more accurate analysis.
Hope this helps,
James
The thing with running an article-based site is that it is not wise to use articles which are already indexed on other sites; those pages will not get indexed on your site and you will receive no organic traffic.
To prevent this you would need to use the Copyscape API on your site and pre-check all content before it goes live - http://www.copyscape.com/api-guide.php
The API calls are not cheap, though this is a viable option for checking large amounts of content on a daily basis.
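As a rough sketch of that pre-publish gate: the endpoint and parameter names below (`u`, `k`, `o=ctext`, `e`) are from the linked API guide but should be verified against the current docs, and the zero-matches threshold is my assumption, not a Copyscape recommendation.

```python
from urllib.parse import urlencode

COPYSCAPE_ENDPOINT = "https://www.copyscape.com/api/"

def build_text_search(username: str, api_key: str, text: str):
    """Build the request for a Copyscape 'ctext' (text search) call.

    Parameter names follow the Copyscape API guide linked above;
    double-check them against the current docs before relying on this.
    """
    params = {
        "u": username,   # Copyscape account username
        "k": api_key,    # API key
        "o": "ctext",    # operation: search the web for this text
        "e": "UTF-8",    # encoding of the posted text
    }
    # The article text itself is sent as the POST body.
    return COPYSCAPE_ENDPOINT + "?" + urlencode(params), text.encode("utf-8")

def should_publish(match_count: int, max_matches: int = 0) -> bool:
    """Gate an article: reject if Copyscape found more matches than allowed."""
    return match_count <= max_matches
```

You would call `build_text_search` before an article goes live, count the matches in the response, and only publish when `should_publish` passes.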
I would noindex tag pages on your blog/website. The SEO benefit from these tags is limited, but the usability benefit is evident.
Using tag pages is still not a bad idea, as they can help with fine-tuning your category-level targeting.
Also the attached image is a great example to compare the two from this post - https://moz.com/blog/setup-wordpress-for-seo-success
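For reference, the noindex itself is a one-line robots meta directive on the tag archive template (the directive is standard; where you set it depends on your CMS, and SEO plugins like Yoast expose it as a setting rather than hand-edited markup):

```html
<!-- On tag archive pages only: keep them out of the index but let
     crawlers follow the links through to the posts. -->
<meta name="robots" content="noindex, follow">
```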
I guess you could push the "Located in Canada" message on your domain and in the meta description, for example, to increase CTR. Increasing authority and rankings is a question for the whole domain.
This is an issue you will see in all markets. Even in Australia, for example, the .com.au country-specific TLD will do well, yet US companies on .com will still rank for long tail. You just need to push the "Buy local" message in your titles/descriptions where possible. My advice is to test it and see what happens.
You could have implemented the canonical tag on the site to stop any cross-site duplicate content issues.
The only time I would remove URLs is if you have a staging site with the same content on a different URL, for example.
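For reference, the canonical tag is a single line in the `<head>` of the duplicate copy, pointing at the version you want indexed (the URL here is a placeholder):

```html
<!-- In the <head> of the duplicate page: -->
<link rel="canonical" href="https://www.example.com/original-page/">
```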
Upload slideshows to Slideshare.net; Google loves the domain from what I have seen, and if you have a good range of links to the Slideshare content you can have it ranking for great terms.
If you upload slideshow content to your own domain it may not be as effective.
1. MajesticSEO is another decent link tool; Ahrefs is very good as well.
2. I like Whitespark for citation finding.
3. Optimizely is good for CRO. It depends on your budget; tools go from cheap to VERY expensive, with some at $4,000+ a month.
4. Hootsuite is good for social management; I also like Topsy for social monitoring.
5. WordPress is good if you use the Genesis framework.
You can track beyond 50 with tools like AWR (Advanced Web Ranking); you may need a proxy to run it, depending on the number of keywords you want to track.
If you want estimated rankings for free then you can use Google Webmaster Tools ranking data or SEMrush, which will show ranking data. That being said, the data from SEMrush and from GWT will be limited and should only be taken as an estimate.
To be honest I tested the tool by Citation Labs and found it to be quite poor for broken link building.
One of the best guides is found here - http://www.quicksprout.com/the-advanced-guide-to-link-building-chapter-7/
Further to that, if you have a full licence for Screaming Frog, try using the external links tab: crawl whole websites and look for broken external links.
Overall broken link building is a difficult area, yet if you make the right contacts it can be really great; we have scored some .gov and .edu links in the past.
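A rough sketch of the filtering step once you have a crawl export. The `Address` and `Status Code` column names are assumptions — match them to whatever your crawler actually outputs:

```python
import csv
from io import StringIO

def broken_external_links(csv_text: str):
    """From a crawler's external-links CSV export (column names assumed),
    return the URLs that came back 4xx or 5xx -- your broken-link targets."""
    broken = []
    for row in csv.DictReader(StringIO(csv_text)):
        status = int(row["Status Code"] or 0)
        if status >= 400:
            broken.append((row["Address"], status))
    return broken
```

Each broken URL is then a candidate: find pages linking to it, and pitch your own content as the replacement.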
Also check how much traffic the tags are currently getting. One site I looked at in the past had around 16k unique visitors a month from some of its tags, so proceed with caution. I agree with the advice above as well.
Personally I do not mind this website here: http://www.theiconic.com.au. They are pretty switched on from an SEO and eCommerce point of view, and in user testing etc. There are some things I would change, yet they have a good basis for what you are asking.
Most of the bigger sites like Amazon etc. are legacy, so things need to be changed over time. I would look at new eCommerce sites which have been built from the ground up with SEO in mind. Usually this is not the case, and it involves fixing things over time and adding things on.
To be honest, if it is all on the same server with the same hosting information etc., it is not going to be much benefit from a long-term point of view. I tested your site in one server tracking tool and it shows all the sites on the same hosting IP as being 98% similar (Google would see the same data), which answers the question above about whether they are on the same C-class.
| Domain | Similarity | Date | Rank |
| --- | --- | --- | --- |
| kidsrcrafty.com | 98% | | > 1 million |
| www.dltk-kids.com | 98% | | > 1 million |
| dltk-bible.com | 98% | Jan. 26, 2002 | 96566 |
| dltk-teach.com | 98% | Nov. 2, 2002 | 115330 |
| dltk-holidays.com | 98% | Oct. 16, 2002 | 230480 |
| coloring.ws | 98% | | 43523 |
| kidzone.ws | 98% | | 55624 |
| dltk-poems.com | 98% | | > 1 million |
| dltk-enfants.com | 54% | | > 1 million |
| makinglearningfun.com | 52% | April 2, 2006 | 178965 |
| dltk-ninos.com | | | |
I think the main question with link removal is: are you trying to get out of a manual or an algorithmic penalty?
1. If the link is nofollow, do not worry about it (many of these junk sites are nofollow).
2. If the site is using a BRANDED anchor text or a URL anchor text, I would not worry, to a degree.
3. The main links you need to look at are GENERIC anchor text links. The thing with these dodgy submission sites is that if they have followed generic links to your site, then yes, I would try to delete them and disavow.
It is really a case-by-case scenario. Another thing you can do is use the Majestic bulk backlink checker at the URL level for any link removal; you can check 150 URLs at a time, so it is good on the fly.
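The three rules above can be sketched as a simple triage function. The function name, labels, and the brand/domain matching are illustrative only; treat the output as a starting point for review, not a verdict:

```python
def triage_link(nofollow: bool, anchor: str, brand: str, domain: str) -> str:
    """Apply the three rules above to one backlink: nofollow links and
    branded/URL anchors are low priority; followed generic anchors are
    the removal/disavow candidates."""
    anchor_l = anchor.lower()
    if nofollow:
        return "ignore"            # rule 1: nofollow, don't worry about it
    if brand.lower() in anchor_l or domain.lower() in anchor_l:
        return "low priority"      # rule 2: branded or URL anchor text
    return "remove/disavow"        # rule 3: followed generic anchor text
```

For example, a followed link with the anchor "cheap blue widgets" would come back as a removal candidate, while one anchored on your brand name would not.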
To be honest, Moz, and even Raven and the others to a certain degree, are only going to give you one level of keyword data. If you are really keen on keyword research and want to do it right, there are custom tools on the market to do just that; an example is http://keywordsnatcher.com/
That being said, the Moz tool set has the Keyword Difficulty section http://moz.com/tools/keyword-difficulty which I guess is kind of what you need.
Really the question is HOW DEEP do you want to go with keyword research?
I have used HostGator and GoDaddy in the past. To be honest, I found HostGator a better fit for what I needed at the time.
Overall these three solutions are entry-level hosting providers.
My advice is the following:
1. Check how much traffic is coming from this section; you can do this with landing-page analysis in Google Analytics or whatever tracking you use.
If you are getting a decent amount of traffic from these articles, even if it's long tail, I would think of another strategy before slapping on a noindex, because when you do, that traffic will go.
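A quick way to run that check on an analytics landing-page export. The `Landing Page` / `Sessions` column names and the `/tag/` path prefix are assumptions — adjust them to your own export and URL structure:

```python
import csv
from io import StringIO

def tag_page_sessions(csv_text: str, path_prefix: str = "/tag/"):
    """Sum sessions for landing pages under a given path from an
    analytics CSV export (column names are assumptions, not a fixed
    Google Analytics schema)."""
    total = 0
    pages = {}
    for row in csv.DictReader(StringIO(csv_text)):
        page = row["Landing Page"]
        if page.startswith(path_prefix):
            sessions = int(row["Sessions"])
            pages[page] = sessions
            total += sessions
    return total, pages
```

If the total comes back meaningful, that is your signal to rethink before adding the noindex.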
I have dealt with a similar strategy for a news website in the past. What many of the big syndication players do is take duplicated content to rank on Google News for 30-60 days, then 404 the page. I have seen this numerous times, though I do not know how viable the strategy is overall.
I've also noticed some news websites play around with canonical tags via various partners on duplicated content, and yes, they also do some noindexing.
Really research this before you implement it. I have done a bit of news SEO for Australian sites; it's an interesting area with limited information online.
To be honest I wouldn't use Crazy Domains; they are a real nightmare if you want to transfer domains, and I have also seen many companies let Crazy Domains inventory expire as the auto-renew setup is not great.
As stated above, I would go with a company like VentraIP or Netfleet (these guys really know domains).
It is not just search query volumes that local results are affecting.
I have a strong feeling it is affecting CTR across search results as well. I have seen some research on local keywords where position 10 for a GEO term gets 20% CTR overall.
Things are changing with the way local results display, and this change has been rolled out over the last few months, if not years. But the thing is, Google also tests and changes results as well.