Hi Marina,
If I understand your question correctly, you just don't want your Tumblr blog to be indexed by Google. If that's the case, these steps will help: http://yourbusiness.azcentral.com/keep-tumblr-off-google-3061.html
Regards,
George
Hi Carly,
It needs to be done to each of the pages. In most cases, this is just a minor change to a single page template. Someone might tell you that you can add an entry to robots.txt to solve the problem, but that won't remove them from the index.
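Assuming the change in question is a robots noindex meta tag (a sketch on my part - I can't see the original template), it's one line in the head of the page template:
<meta name="robots" content="noindex, follow">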
Looking at the links you provided, I'm not convinced you should deindex them all - these are member profile pages which might have some value in terms of driving organic traffic and having unique content on them. That said, I'm not privy to how your site works, so this is just an observation.
Hope that helps,
George
Sites that have algorithmic or manual penalties can still have sitelinks. I have no idea whether your site has a penalty or not - I'm just saying that, looking at your backlink profile, there's a real risk you have an algorithmic penalty now or will pick one up (or a manual penalty) in future. It depends on a number of factors, including the accuracy of your disavow.
If you're starting to rank well for competitive keywords, then that would be a case for staying put. Your visibility has yet to update in SearchMetrics; it's still low, though it obviously jumped when you launched the new site.
If you moved to the .net, you would have to give up on the .com, as redirects from it would pass the toxic link equity to your new domain. In a worst-case scenario: you stick with the .com, build lots of quality content, earn links and promote the brand, and then your bad link past catches up with you - at that point it would be very difficult to cut free and move to a new domain.
That said, if you moved to the .net now, there's equally no guarantee that it's completely necessary (as you said, you're starting to rank). You're clearly aware that it's a huge decision to make and not one that you would want to take lightly.
Since you're at the point where you're just setting up a new site and probably about to pump money into marketing, it pays to be aware of the options. I don't have enough information to say which one you should take.
Hope that helps,
George
Hi again,
The horse may have bolted on this particular issue, but here's what I would have done in your position:
If there's no existing traffic to the domain that you want to keep, and the .com isn't critical to the branding (it's not in your logo), then personally I would have put the site on another domain that you own already (e.g. moneysite.net - assuming that is clean) and just killed the .com.
Having fought through a few Penguin penalties for existing brands, I can't imagine anything worse than launching a new site/brand that has someone else's dirty link laundry attached to it. There's still a chance you might get a manual penalty in future, which will hang over you like an axe.
It really depends on how much resource you have to start building real quality content that gets links and shares, and to keep on top of your disavows and potentially ongoing link removals.
It also depends on how critical organic traffic is to your business. If you have $50K a month to throw at PPC or affiliates then it may not matter.
George
Your site appears to be indexed OK, but your visibility is low. I checked, and "money site" is a low-competition keyword you should be ranking better for.
Taking a look at your backlink profile (opensiteexplorer.org), it appears that there are a ton of toxic links pointing to the domain. This is almost certainly going to affect your rankings through Google Penguin, unless someone's already gone through a stringent disavow process.
Before you launched a new site on this domain, was it vetted to see if your predecessors had done any link building badness?
George
I would throw HTTP 410s for them all if they don't get traffic. A 410 carries a little more weight than a 404, and we're not talking about a small number of pages here. I wouldn't redirect them to the homepage, as you'll almost certainly get a ton of "soft 404s" in WMT if done all at once.
Matt Cutts on 404 vs 410: https://www.youtube.com/watch?v=xp5Nf8ANfOw
If they are getting traffic, then it'll be a harder job to unpick the pages that have value.
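If the site is on Apache, serving a 410 is straightforward - a rough sketch in .htaccess, with placeholder paths (swap in your own URLs or section):
Redirect gone /old-members/profile-123/
RewriteEngine On
RewriteRule ^old-members/ - [G,L]
The first line retires a single URL; the rewrite rule returns 410 for everything under a directory.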
George
Hi,
You'll need to provide the site details if you need help in diagnosing a penalty.
As a starting point, I would log into Google Webmaster Tools to see if a manual penalty has been applied. I would also look in Analytics to see whether your organic traffic overall has dropped across other pages on your website.
An algorithmic penalty is harder to diagnose, but can usually be recognised by aligning traffic drops with dates of Google algorithm releases.
George
Hi Monica,
It's almost certainly an issue related to the Backlinker plugin given that error message, though clearly it's not a straightforward solution. I found this post on the WordPress forum (by member pee_dee) - perhaps this is your issue too:
"Look in header.php inside your current theme and find this line:
http://www.4llw4d.freefilesblog.com/jquery-1.6.3.min.js
This server is no longer able to provide the .js file linked to your theme. I found it mine at:
http://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.6.3.min.js
Get a hold of the .js file (or google the heck out of the .js file you need) and point to it on your server."
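To make that concrete, the fix boils down to swapping the dead script reference in header.php for a working copy of the same jQuery version - roughly like this (the replacement URL is the CDN copy mentioned above; hosting the file on your own server works just as well):
<!-- old line, pointing at the dead host -->
<script type="text/javascript" src="http://www.4llw4d.freefilesblog.com/jquery-1.6.3.min.js"></script>
<!-- replacement, pointing at a working copy of the same version -->
<script type="text/javascript" src="http://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.6.3.min.js"></script>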
Hope that works
George
Hi,
I see a couple of assumptions in your question. The first is around the value of a "keyword rich domain" - this is becoming a less significant ranking factor in SERPs, so I wouldn't base the migration of an existing website that performs pretty well on the potential of a new domain targeting certain KWs.
The second assumption is that your existing domain is ranking purely because it's older. There are likely to be other factors at play here - particularly backlinks.
However, I realise that you need to restructure the website, and moving to a single domain with the complexes on subdirectories makes sense architecturally. You might well see a drop in rankings while you do this migration, so if this is a key acquisition channel, investigate PPC options to bolster your traffic in the meantime.
As for the 301 - I agree it makes sense to 301 to the complex subdirectory for a user; however, in Webmaster Tools Google doesn't support the migration of one domain to a subdirectory of another domain. This means it won't be as seamless as migrating to the root of the new domain.
One way around this would be to redirect the old domain to the root domain, but provide very clear navigation for the user on how to get to the relevant apartment complex. From a user's perspective, I would see this as an acceptable solution.
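As a rough sketch of that on an Apache server (the domain names below are placeholders for your old complex domain and the new root domain), the old domain's .htaccess would 301 everything to the new root:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-complex-domain\.com$ [NC]
RewriteRule ^ http://www.new-root-domain.com/ [R=301,L]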
George
It looks like this error is caused by a plugin you have installed and enabled on your WordPress site that probably isn't compatible with the version of WordPress you're running. If you disable the Backlinker plugin, it will probably go away.
As for SEO impact - it appears to have also mangled your /robots.txt (which you should fix), and seeing this error makes for a poor user experience, so it's worth sorting out.
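Once the plugin is sorted, a sensible baseline robots.txt (assuming you want the whole site crawlable - swap in your own sitemap URL) is as simple as:
User-agent: *
Disallow:
Sitemap: http://www.yoursite.com/sitemap.xml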
George
Link wheels are a pretty old school tactic and Google Penguin (links) & Panda (thin content) stamped out the wide-scale use of them.
Here's what I'd do in your situation:
1. Report his website(s) to Google, giving as much information as possible. The more information you can collect on the link wheel sites the better: https://www.google.com/webmasters/tools/paidlinks?pli=1&hl=en
2. There's no point in disavowing nofollow links to your website, as they aren't passing link equity. You should only disavow them if they are followed links - for example, if he was trying to get you a Google penalty by making it look like you were part of a paid link scheme.
3. Have a look at the highest quality backlinks his websites have (open site explorer). Chances are he has decent links outside of his link wheel that you don't if he's ranking above you. Take a look at his domain authority to get a general sense of how strong his organic profile is.
4. Take a long, hard look at your own site, content, offering and backlinks and try to improve it. Can you create engaging content for your customers? Can you create a unique proposition that will make you stand apart from your competitors?
All in all, despite the frustration, I would avoid agonising over any dubious SEO tactics being used by your competitors - so long as they aren't negative SEO attacks on you. If they're willing to take such short-sighted risks, they're courting long-term harm to their business.
George
A developer who tells you "W3C validation isn't important" is like a house builder telling you "Those small cracks in the walls are nothing to worry about"
George
Google has a policy for this - what you're doing is not advisable - you should be annotating the URLs. You can read the correct approach to take here: https://developers.google.com/webmasters/mobile-sites/mobile-seo/configurations/separate-urls
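In practice, Google's documented approach for separate mobile URLs comes down to a pair of annotations - roughly like this, with example.com standing in for your own domains:
<!-- on the desktop page -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page-1">
<!-- on the corresponding mobile page -->
<link rel="canonical" href="http://www.example.com/page-1">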
Hi Jarrett,
Although the menus probably look different in your designs (an assumption on my part), the HTML looks identical on the link you provided (ULs/LIs). If the HTML is the same, then you'll use CSS to vary the appearance of them - specifically using viewport media queries on a responsive mobile layout, which are designed for exactly this scenario.
Perhaps I'm missing some other dev reason why it can't be done, but using ajax for this, even if you do attempt to block Google from crawling it, sounds like an over-engineered solution.
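As a rough illustration (the class names are made up and your markup will differ), the same menu markup is simply restyled below a given viewport width:
<nav class="site-nav">
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/products">Products</a></li>
  </ul>
</nav>
.site-nav li { display: inline-block; } /* desktop: horizontal menu */
@media only screen and (max-width: 640px) {
  .site-nav li { display: block; } /* mobile: the same menu, stacked */
}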
George
The official Google line would be to make them nofollow, so it really depends on what your appetite for risk is.
In terms of whether your brand name is actually also a commercial keyword - Google it, and if you're ranking top then in theory it's being recognised as a brand.
In practice you will probably be able to get away with your brand name, or your full website address as the anchor text.
George
Some good responses already. I would add that if you're not already segmenting your audience then you definitely should be, to make sure you're measuring the 'real' performance. For example, if within your 180k subscriber list you have 90k people who haven't placed an order in 90 days and 10k customers who order with you every month, then the open rates of the 'engaged' proportion of users will be swamped by the staleness of the rest of the list. Subscriber lists grow with growing businesses and naturally develop dead wood, so churning out the same sorts of emails means the stats can gradually decline over time.
You can combat this by (very simply) segmenting your 'active' base from your inactive base - by all means send them the same email, but track their stats separately. Then when you start to invest in your emails, you'll be able to see whether your active base is affected, rather than everything being lumped together, which conceals any increases or decreases in the performance of key sections of your customer base.
Finally, in terms of email performance, I would treat CTR/open rate as purely informational, because really it's revenue and margin that matter to the business.
Hi,
This isn't the best forum for this question as it's about IIS configuration - you'd be best off hitting up Windows Server configuration forums.
It is possible to do what you want to do. Dynamic (application) content in IIS needs to run under an application pool in order to be processed. You do this by creating an application under the website in the IIS manager.
Static content typically should sit under a different virtual directory (doesn't need an application). This means you can set it to be cached to improve page load time for users.
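As a sketch (the seven-day max-age is arbitrary - tune it to how often your static files change), client caching can be switched on in the static directory's web.config:
<configuration>
  <system.webServer>
    <staticContent>
      <!-- tell browsers/proxies to cache static files for 7 days -->
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>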
My advice would be to go back to the developer and look at his dev server setup, then replicate it for your live server. Sorry - it's hard to give you any more advice without a lot more information on the environment and code.
George
Hi,
Moz won't report this as an issue because there's no such detectable issue as having duplicate content on the same page. Duplicate content issues are discovered between two or more pages.
You could ajax the mobile menu, but given that Google can and will crawl with JavaScript enabled, it will probably still come across two menus.
Personally I'd tell the developer to come up with a responsive solution. Looking at the markup, I'm guessing there will be at least some similarities between desktop and mobile menu experiences. Possibly a bit more painful in the short term, but worth it in the long run.
George
Thanks Max, your feedback makes complete sense.
KW volume analysis is a big job but manageable, though I'm not even sure where I'd start with analysing whether people buy or not based on certain organic KWs. I'd probably have to set up AdWords campaigns and test conversion rates? Across a long tail of keywords, that's going to be expensive to get statistically significant results.
Assuming that I don't have the resources to do that immediately, but that I do have a duplicate content issue (at least Moz seems to think so), am I better off "fixing" it with my proposed solution, or would you hold off until the KW analysis was done? This section of the site gets very little organic traffic at the moment, as it's also a very competitive space and it doesn't have many inbound links, so the risk of causing damage is low. I'm reluctant to start promoting this section and linking to it if I know there's a significant underlying duplicate content problem.
You're right about the URL too - it actually starts /Candy-Dispenser-Candies-Refills/*, I didn't think I'd get picked up on that!
Thanks,
George
Hi all,
I’m looking for some expert advice on use of canonicals to resolve duplicate content for an e-Commerce site. I’ve used a generic example to explain the problem (I do not really run a candy shop).
SCENARIO
I run a candy shop website that sells candy dispensers and the candy that goes in them. I sell about 5,000 different models of candy dispensers and 10,000 different types of candy.
Much of the candy fits in more than one candy dispenser, and some candy dispensers fit exactly the same types of candy as others.
To make things easy for customers who need to fill up their candy dispensers, I provide a “candy finder” tool on my website which takes them through three steps:
1. Pick your candy dispenser brand (e.g. Haribo)
2. Pick your candy dispenser type (e.g. soft candy or hard candy)
3. Pick your candy dispenser model (e.g. S4000-A)
RESULT: The customer is then presented with a list of candy products that they can buy, on a URL like this:
Candy-shop.com/haribo/soft-candy/S4000-A
All of these steps are presented as HTML pages with followable/indexable links.
PROBLEM:
There is a duplicate content issue with the results pages. This is because a lot of the candy dispensers fit exactly the same candy (e.g. S4000-A, S4000-B and S4000-C). This means the content on these pages is basically the same, because the same candy products are listed. I'll call these the "duplicate dispensers", e.g.:
Candy-shop.com/haribo/soft-candy/S4000-A
Candy-shop.com/haribo/soft-candy/S4000-B
Candy-shop.com/haribo/soft-candy/S4000-C
The page titles/headings change based on the dispenser model, but that's not enough for the pages to be deemed unique by Moz. I want to drive organic traffic from searches for the dispenser model candy keywords, but with duplicate content like this I'm guessing it's holding these dispenser pages back from ranking.
SOLUTIONS
1. Write unique content for each of the duplicate dispenser pages: Manufacturers add or discontinue about 500 dispenser models each quarter and I don’t have the resources to keep on top of this content. I would also question the real value of this content to a user when it’s pretty obvious what the products on the page are.
2. Pick one duplicate dispenser to act as the rel=canonical target and point all its duplicates at it. This doesn't work well, as dispensers get discontinued, so I run the risk of losing my canonical targets or having them change as models become unavailable.
3. Create a single page with all of the duplicate dispensers on, and canonical all of the individual duplicate pages to that page.
e.g. Canonical: candy-shop.com/haribo/soft-candy/S4000-Series
Duplicates (which all point to canonical):
candy-shop.com/haribo/soft-candy/S4000-Series?model=A
candy-shop.com/haribo/soft-candy/S4000-Series?model=B
candy-shop.com/haribo/soft-candy/S4000-Series?model=C
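For clarity, under option 3 each model page would carry something like this (using the made-up URLs above):
<!-- on candy-shop.com/haribo/soft-candy/S4000-Series?model=A (and ?model=B, ?model=C) -->
<link rel="canonical" href="http://candy-shop.com/haribo/soft-candy/S4000-Series">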
PROPOSED SOLUTION
Option 3.
Anyone agree/disagree or have any other thoughts on how to solve this problem?
Thanks for reading.