It is hard to tell if there is a problem without knowing what the site is.
It only has two external links? If the other pages have more, that would be your answer.
Also make sure you are checking the right address - try it with www and without.
I was looking for the same and found this: http://www.seomoz.org/q/company-nsphere-www-nsphereinc-com
The lack of pricing info, plus the somewhat confusing interface with no direct upgrade option on the free version, was enough for me to assume that it is probably a little shady. The sample directory in the above-mentioned discussion confirms that suspicion.
Unfortunately there is no exact answer to this. Say you get a bunch of links from a variety of sites. The "better" quality sites - blogs that get a lot of traffic, are updated frequently, and maybe have a higher PageRank, among other things - will see their pages indexed sooner than lower quality sites like directories. In some cases, the site may be decent quality but really huge, so indexing new content on it may take longer too. So a link from a high-profile site may get indexed almost immediately, but your links from crappyseolinks4salenow.com may take a few weeks. Then, once Google finds the new links, it will also take some time until they are factored into Google's ranking of your site. And that can also depend on how often your own site is crawled and re-calculated.
So it could be anywhere from a few days to a few weeks. I have seen the effects of even a good link take anywhere from a day to a month to show up.
The bullshit detector is showing a 95% likelihood of this being bullshit.
I agree with most of the others here who point out that they are probably doing forum/comment spam, and junk directories.
Finding a good SEO and link building service is going to be difficult if you focus only on the quantity of links. Like Thomas said, sometimes a handful of really good links is all you need. A good SEO company or consultant will be able to find those. Instead of number of links, focus more on proven results. Ask for examples of results (not link counts, but traffic, keyword position, or conversion data) or references from clients who have been helped. THEN dig a little deeper and make sure they won't be doing things to get you in trouble.
I am assuming you intend to use noindex only on the duplicate content articles. Using noindex on everything would also prevent your original content from being indexed and found through Google.
If you are using WordPress or something else that allows showing excerpts, you could try making the article pages noindex and showing only excerpts on the main page and category pages, which would remain indexed and followed. I think that would keep the articles out of search results and avoid duplicate content penalties, while allowing the pages that show the excerpts to still be indexed and rank OK.
The idea here is that the pages showing the excerpts would have enough text to help the home and category pages to rank for the subject matter and hopefully not be seen as what it is - copied content.
You will probably eventually get caught by the Panda, but this may work as a temporary solution until you can get some original content mixed in.
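In case it helps, the noindex part is just a meta tag in the head of each duplicate article page (in WordPress this would normally be handled by an SEO plugin rather than written by hand, so treat this as a sketch):

```html
<!-- In the <head> of each syndicated/duplicate article page ONLY -->
<!-- "noindex, follow" keeps the page out of the index but still
     lets crawlers follow the links on it -->
<meta name="robots" content="noindex, follow">
```

The home and category pages that show the excerpts get no such tag, so they stay indexed as normal.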
I am currently doing an audit of a site that has used clicksubmit for several months as its main method of link building. I am not very deep into the link profile yet, but it doesn't look good. Almost all the same anchor texts, the linking sites look as though they were made specifically to host links... so far the client site doesn't have a single link that I would consider "good". I hate having to report that.
I will check in with my findings later, but I have to agree with Michael York above - Great Reviews + Dodgy Looking Product + Sounds Too Good to Be True = A client website that will be looking for penalty remediation services soon.
You may be over-thinking this, or taking that "one keyword per page" advice too seriously.
You are right that you should accurately describe the category page, and using that keyword (and anything else that accurately describes the topic) is going to be part of that. Since that page is the one that is actually about the category, I would not un-optimize it. I wouldn't even worry about removing the keyword from the home page, either, since it is also relevant. You may want to link from Home to the category where that keyword is used, though I don't think that would make a huge difference.
The good news is, you just might end up with both showing in the SERPs.
Pages on a high DA site will usually build authority easier than on a low DA site.
The reason people build links to profiles, etc., is to improve the value of the links from the profile to their site. For example, the BBB home page has very high authority, but the page for "Joe's Website" is rarely visited and relatively unknown. Building a bunch of links to that page may seem odd, but it will often help that particular page build authority, thus making its links to Joe's Website more helpful.
Why not just build links directly to your target? I'd say why not do both? Check out this article that has some info on how links are valued, among other things: http://www.seomoz.org/blog/understanding-link-based-spam-analysis-techniques
Noindexing the syndicated articles should, in theory, minimize the likelihood of having a Panda problem, but it seems like Panda is constantly evolving. You will probably see some kind of drop in rankings as the number of indexed pages for your site decreases. If you have, say, 1000 pages total on the site and suddenly 900 are taken out of the index, that might be a problem. If it is a much smaller percentage of the site, you might not have a problem at all. Other than the number of indexed pages, I don't think you will have a problem once the syndicated stuff is noindexed.
It will probably take Google a while to re-index/un-index the pages, so hopefully it won't be a fast drop if there is one. In the long run, it is probably better to at least have the appearance of trying to do the right thing. Linking to the source, and maybe using rel=canonical tags to the original article would also be a good practice.
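For reference, a cross-domain canonical on a syndicated article would look something like this (the URL here is just a placeholder for wherever the original was published):

```html
<!-- In the <head> of the syndicated copy, pointing at the original article -->
<link rel="canonical" href="http://original-publisher.example.com/original-article/">
```

That tells Google which version should get the credit, which is about as close to "doing the right thing" as syndicated content gets.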
Why don't you ask him what people who would benefit from his guidance would search for, and go from there? I am always a little surprised when clients ask me what would be good keywords for their service or product. Shouldn't they have some idea what their potential customers want?
I usually answer with a question like "what does it do?" or "what would you ask for if you went to a store looking for it?"
Aside from that, you might try:
enlightenment
spiritual guidance
and of course "awareness"
Is there a quote or saying for which he is famous? People often remember a phrase but not where it came from and Google it.
For your on-page strategy that you have described, ask yourself how each part of that benefits the reader. If it really doesn't benefit the reader and you are only doing it because you believe it might help SEO, then skip it.
Put keywords in the back of your mind, not the front when writing titles. Sometimes, that means not having the keywords at the front of the title as well.
I looked at the example page you gave and the title looks a bit repetitive: "NLP Practitioner Training, NLP Practitioner Certification" To a searcher, that doesn't look very appealing in the SERPs. The repetition of "NLP Practitioner" could even keep the page from ranking as well as it could. Try something a bit more human, like "New NLP Practitioner Training Locations in California & Utah". That's more people-friendly, and if I gathered correctly from a quick skimming of the article, is a better description of what is on the page than simply two keywords stuck together.
So think first about what a person will think when they read that title in the SERPs, THEN worry about where you can fit your keywords into it. Really, this is how you should think about ALL of your content, too. Write about the topic, and make it your priority to convey your message well to readers. If you write about it effectively, you will probably use your keyword enough times. The idea is that instead of worrying so much about one or two keywords, you will be able to get more visitors who were searching for a variety of things related to your topic, in addition to ranking better for that keyword. Not to mention you will run less risk of over-optimizing. If you haven't seen the Blueprint yet, do check it out: http://moz.com/blog/how-to-rank
The part about "Dream Your Theme" goes into more detail about the way optimizing for a theme or topic works. I have also touched on this idea in a few different articles: http://kercommunications.com/tags/topical-optimization/
OK, to save you the trouble of reading Google's official guide (which would very likely point you in the right direction), here are a few things to check that could explain a sudden drop:
Content quality and originality - are these pages 100% original? Have you examined the technical side of your SEO with a tool like SEOMoz's On-Page grader?
Also, are the links to your site from good quality sites, or are they from link farms, low quality directories, auto-blogs, etc.? As Google continues to adjust Panda, many sites have dropped in the last week or so. Even if you were not directly hit by those changes, sites that link to you may have been, which means those links have less value now.
BTW - everyone should re-read Google's SEO starter guide. It is remarkable how many people think they understand SEO, but have gotten their information from bad sources.
You should use an XML sitemap to keep Google up to date with new pages - I could not find one for your site. Otherwise, if the event pages can only be found by using the search feature on your site, those pages will probably not be crawled and indexed. Fetch as Googlebot may work, but it probably will not be as fast as using a sitemap.xml file.
Would it be possible to have the event pages available through some kind of navigation in addition to being found by your site's search?
You might also consider setting up an RSS feed of the events and submitting it to feed burner and other RSS sites. That may be a little complicated, but would also help speed up indexing.
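If it helps, a minimal sitemap.xml with one event page entry looks like this (the URL and dates are made up - you would generate the real file from your event data, then submit it in Webmaster Tools or reference it from robots.txt):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/events/some-event/</loc>
    <lastmod>2012-05-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <!-- one <url> entry per event page -->
</urlset>
```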
Unless you are planning on "un-optimizing" your site for the keywords you are currently targeting and changing the entire focus of the site, targeting new keywords - by making sure they are present on the site or in your off-site link building efforts - should not have any negative effect on your current rankings.
Also, unless they have disabled it in their browser, Google does attempt to remember the sites users visit and what they like. So in theory, returning users may even see your site in a higher position than they previously did.
In the old days of the WWW (like 10+ years ago), sites often begged users to bookmark them. I don't think you want to go down that path, but relying on search traffic for customer retention is probably not the best way to handle it. Offer newsletter subscriptions, and definitely use Facebook Like, Google +1, and Twitter Follow buttons. That way, when users search the same phrases, they will definitely find you again if they have hit Like, +1, etc.
Search is social - especially now that Google has crippled user query tracking.
I would use canonical in this situation. That way you won't have to rework all your navigation to have a single url for each product and can keep your category structures intact.
This is a situation that many eCommerce platforms really don't handle well. For example, I know of one that claims to offer canonical for product pages, but it gives each version (boxes/product.html and circles/product.html) its own unique canonical tag rather than both referring to the same product.html page. Kind of misses the point.
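To be clear, the point is that every category version of the product page should carry the same canonical URL. Something like this (paths are hypothetical):

```html
<!-- On BOTH /boxes/product.html and /circles/product.html -->
<link rel="canonical" href="http://www.example.com/product.html">
```

The broken implementation I described instead emits each page's own URL as its canonical, which tells Google nothing and leaves the duplicates unconsolidated.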
In Webmaster Tools, go to URL Parameters under Site Settings. You will probably see some of the parameters you want to block in the list. Click Edit next to a parameter you want to block, like "?app=members", and then choose the appropriate settings to prevent Googlebot from crawling it.
I don't know if that is the preferred way of doing it, but that should block those dynamic pages.
You could try using https://www.google.com/webmasters/tools/submit-url?continue=http://www.google.com/addurl/ to submit the URL to google, but if the page can only be found by using their locator, it may not get indexed since it is not linked within their site. Worth a try though.
One static URL that can be found by your users from within any of the categories that the product is in would probably be best. That way there is no chance of duplicate content issues if the search engines were to find both and not resolve the canonical tag.
Really, I think it could go either way. Whichever one is easiest to implement in your particular situation, unless there are a lot of inbound links to your products. If that is the case, changing the URLs would require 301 redirects from the old URLs.
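If you did end up changing URLs and the site runs on Apache, the 301s can be as simple as one line per old URL in .htaccess (paths here are hypothetical, reusing the earlier boxes/circles example):

```apache
# Permanently redirect the old category-specific URLs
# to the single canonical product URL
Redirect 301 /boxes/product.html http://www.example.com/product.html
Redirect 301 /circles/product.html http://www.example.com/product.html
```

With more than a handful of products you would want a rewrite rule or a redirect map instead of listing every URL, but the principle is the same.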
Not sure if it was a connection issue on my end or what, but that page takes a very long time to load, which could explain the lack of indexing of the pages linked from it.
Also, Google states that pages submitted with the Fetch as Googlebot tool are not guaranteed to be indexed, so there may be quite a delay on that. Are all pages included in your XML sitemap? An XML sitemap is the preferred way to notify Google of pages it may not normally find. Here is a link with more about XML sitemaps: https://www.google.com/support/webmasters/bin/answer.py?answer=156184&hl=en
Even with an XML sitemap, Google may not immediately crawl many pages. Actually, indexing is rarely immediate. The frequency of crawling and speed of indexing has to do with many of the same factors as your ranking - quality, number of inbound links and pagerank, site performance, etc. If all your pages load quickly and you are in pretty good shape as far as links, etc, you could also try something to draw Google's attention to the new pages - like Tweeting a link or posting to Google+. That seems to "force" faster indexing in some cases.
I just checked your site with webpagetest.org and it is showing a load time of about 14 seconds. Tools.pingdom.com seemed to get hung up on some of the javascripts and couldn't complete its test. Doing what you can to speed up the site and address any other "quality" issues will help with indexing, and your performance in search engine results in general.
It is also there so users can block a site they just don't want to see anymore. It isn't related to your site specifically - it is simply a "feature". There is nothing you can do about it aside from making sure your site isn't something anyone would want to block. If a user chooses to block your site, they won't see it anymore. It won't affect what other users see unless several people block your site, in which case it may cause lower rankings. As far as I know, nobody has determined exactly how many people must block your site for it to affect search results for other users, but Google does claim to use that info.