RE:
Changed the page name, etc. - still get a "missing canonical tag" error. I could be wrong, but at this point I am convinced there's a hidden problem, or a bug in the system.
No more posts here, I emailed help@moz.com
What I ended up doing is just changing the page name from default.asp to "fedex-routes-forSale.asp", changing the navigation links, and setting up an .asp code page to redirect - effectively skirting the default-page canonical syntax conundrum.
This also cleared up the duplicate content issues - default vs. root, www vs. root, etc.
I'm still curious as to the dynamics of this issue, but opted for a nice, tidy PDF for my client, with all issues solved.
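The rename-and-redirect trick above boils down to very simple logic. A sketch of it (in Python rather than the classic ASP I actually used, and with illustrative paths - the retired default.asp's only job is to answer with a permanent redirect to the renamed page):

```python
# Illustrative only - the real fix was an .asp page doing a 301 redirect.
# Paths here are made up for the example.
OLD_PATH = "/fedex-routes/default.asp"              # hypothetical old location
NEW_PATH = "/fedex-routes/fedex-routes-forSale.asp"  # renamed page

def redirect_response(path: str):
    """Return (status, headers) for a request hitting the retired page."""
    if path == OLD_PATH:
        # 301 tells crawlers the move is permanent, so the old URL
        # stops competing with the new one.
        return 301, {"Location": NEW_PATH}
    return 200, {}

status, headers = redirect_response(OLD_PATH)
print(status, headers["Location"])
```

The permanent (301) status is the important part - a temporary redirect would leave the old URL in play.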
I'm getting two "no canonical tag" errors for the default page of a sub-directory (www and root) - again, NOT a subdomain.
Since the page is not the root of its own site, I tagged it as --
I have tried it without the default.asp, but the error remains. I've been doing this for 24 years and don't remember running across this before.
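For anyone else chasing this error, the check a crawler runs is presumably something like the following - parse the page and look for a `<link rel="canonical">`. A minimal sketch (the example URL is made up):

```python
# Sketch of a "missing canonical tag" check - not Moz's actual crawler code.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

def find_canonical(html: str):
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical  # None means the tool would flag "no canonical tag"

page = '<html><head><link rel="canonical" href="http://example.com/sub/"></head></html>'
print(find_canonical(page))
```

If this kind of parse comes back empty on a page that clearly carries the tag, that would support the hidden-problem/bug theory.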
I just "test graded" a TOTAL dog of a page (a WordPress site created by a print house) with NO SEO concept at all - main keywords nowhere! No tags.
It amazingly got a 69%. That's like giving an army 69 out of 100 points for having polished boots and ironed uniforms - even though they had no weapons or ammo.
The old version measured basic SEO, which is common to mobile, tablet, and PC. You could instantly get a fix on where you went wrong. Now you have to weed through peripheral data to get to the meat. NOT an upgrade.
One cost-effective benefit of using the root domain (no W's) is in setting up a Moz campaign. If you set it up as a subdomain - www.sales.com - Moz will track that version only. If you happen to have subdomains connected to that site, the performance of their keywords will not be tracked.
Setting up your campaign with the root domain, sales.com, means auto.sales.com, homes.sales.com, insurance.sales.com, ad nauseam, will all be tracked within a single campaign slot.
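The scoping idea is easy to illustrate (this is NOT Moz's actual matching code, just the general logic): a campaign scoped to the root domain matches any hostname under it, while a www-scoped campaign matches only itself.

```python
# Illustration of root-domain vs. subdomain campaign scope - hostnames invented.
def in_campaign(scope: str, hostname: str) -> bool:
    """True if hostname falls under the campaign's scope."""
    return hostname == scope or hostname.endswith("." + scope)

# Root-domain campaign picks up every subdomain:
for host in ["sales.com", "www.sales.com", "auto.sales.com", "homes.sales.com"]:
    assert in_campaign("sales.com", host)

# A www-scoped campaign misses the sibling subdomains:
print(in_campaign("www.sales.com", "auto.sales.com"))
```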
This code-ratio metric is merely one of many signals that can trigger a spam "conviction". It would seem logical that if you write junk-free code and raise few to no other flags, it should be ignored.
Surely if lean code alone were a violation, there would be no Google PageSpeed tool!
Jeremy - you appear to be "The Man" here. A very good answer to a less than obvious solution.
I assumed that the link was appended to whatever else I wrote - like an email signature.
Now I know, problem solved - Thanks!!
Exactly!! - this post is supposed to be in the spirit of a feature request.
" if you're running a campaign for a client and they want a change, you could make that change."
Exactly - they shouldn't have a misleading link saying THEY can make the changes.
"it's unwise to give multiple people direct access to a login for any analytics platform."
Exactly - there should be no link telling the client otherwise.
"If this was sent to you (presumably the Moz user),..."
Exactly - I copy myself on client reports to verify they go out; otherwise I would not be aware of such verbiage.
I already know I can edit the report - I put it together!!
"I think the intention is that you'd download the report and write your own email."
See below - why is the client offered the (false) ability to change report settings?
"To make changes to this report, simply edit your report settings.
Happy optimizing,
Moz"
PS - to all those wondering what the number of stars under a poster's picture means... the more stars they have = the less paid work they have & the more free time to hang out and yak!!
Ergo - the fewer stars you have, the busier you are!! Right, Michael ,-]
Sometimes you just have to trust your gut! Why would Google create the enormously nit-picking PageSpeed tool - then penalize you for following its dictates? I draw the line at certain reasonable levels - and still rank in the top percentiles. If I were new to the game I would be torn between thinking I needed to cut more code for Google - or fatten up to pass Moz's spam-ometer!
The bandwidth load on Google has grown to staggering proportions - it is in their best interest that we pare down the code, to counteract the monumental glut of user-generated fluff.
http://search-engine-upgrade.com/google-data-center.htm
Moz tools have made me look like a guru to my client base - BUT - since I know I am not a spammer, I will choose to ignore this spam score and follow what Google clearly encourages.
I have likewise noticed this on several sites that do very well in Google. Some sites that were near or at the top of Yahoo page one have fallen suddenly & drastically!
Specifically, two smaller (what I call) "magnet sites" or "satellite sites", tuned to narrow-band topics and linked to the main site, which also covered those topics. They were both number one, page one - and beat the main, large site on their topic.
The net result is that the two magnet sites fell back, but Yahoo made the "mother ship" number one in both categories. Yahoo "giveth and taketh away".
I too noticed a large increase in ad-space real estate. Since there's so much less room for organic results, they are obviously quashing multiple SERP listings from the same outfits. I noticed a similar result in Google years back - a satellite site will virtually disappear, but the main site goes to number one.
Damned if you don't - damned if you do - what a nutty game! This is a total surprise. I have been building websites since 1994, and I continued building lean code when bandwidth went up. I don't use a CMS - all hand-coded ASP or, less frequently, PHP. My newer sites all score in the 90s in page speed.
Now I am penalized for this? Shouldn't that message say "Congratulations, Site Mark-up is **Exceptionally** Small"?
RE: http://seUP.net/sell_business-optimization.pdf
This is confusing - above is a link to a branded (by global setting) page-grade report I ran, dated October 2012. I used these all the time to show prospects how bad their SEO was, like the linked one above. It was a great tool for landing clients. I was a major crusader in the branding issue back then.
NOW we can only brand them on set-up campaigns, and we are restricted to the keywords being tracked. That is a huge downgrade in service.
I have also noticed a couple of older sites dropping off the "Bing/Yahoo" charts, all the way across the board, and ranking very high in Google.
I have found that uniform, across-the-board drops like that always indicate some serious breach of SE policy, and/or penalties as a result. Mere variances in priorities can't account for a 50+ position difference. I have noticed this over the last few months or so.
There is something much more to it than just local listings. It is the most unusual and disturbing trend I have ever noticed. I agree Bing WT's are a good place to start, but I haven't yet found the culprit either.
Don't trust a quick answer!
I rebuilt a site and got it back in the running with Google a year ago. They have contracted no work since then. It has dropped about nine or ten points in Google - but recently fell out of the top 50 in Bing/Yahoo. Once out of the top 50, who knows where they are - off the map!
I have noticed other sites that have dropped significantly in Bing/Yahoo from rankings formerly similar to Google's.
In Many Cases: Google stays up - Bing/Yahoo goes significantly down
@[Vadim Mialik](http://moz.com/users/view/145337) I think your "familiarization with the platform" is the answer. I know how to optimize - just not in WordPress or other blog tech.
RE:"what you mean by serious SEO"
I presently rebuild pages to spec, in ASP, rather than try to un-corrupt old code. I rename pictures, rename page filenames, edit the existing copy, etc. IOW, I build the pages around the SEO target. As I write the code, I make sure everything "fits" together. I have been getting excellent results this way.
If I start retrofitting Wordpress sites, I won't have that kind of access to customize.
I don't agree that WordPress is inherently search-friendly, but I can imagine that demand has forced them to come up with workarounds.
I think Vadim has the right approach - learn the mechanics of what can be manipulated in WordPress, then learn what I need to make up for what I can't do - my way.
I will definitely take a closer look at the Yoast plug-in. Thanks for all the replies!
I am a longtime practitioner of classic ASP. I build sites with SEO built in, and I mostly administer sites I have built.
I have, up until now, had a policy of not working on SEO for WordPress projects because -
I have a long time client who wants me to do contract SEO work on various Wordpress sites on his servers.
Can someone point me to a definitive source on the latest methods for WordPress SEO? I am very proficient in SEO for conventional websites - I just need to know how to implement it in WordPress.
I don't see how plug-ins can implement serious SEO, but my mind is open.
Ryan - I think you have sincerely tried to relate the world as you know it and I appreciate your time. I will leave this final thought on the subject of popular assumptions -
In HTML 4, the W3C deprecated target="_blank" - if you used it on a page, your page wouldn't validate. The reason given was their opinion that it took away the visitor's choice of how many windows were open. In the designer's view, NOT using it for external links simply took away visitors. As Google and just about everyone else continued opening new windows, and the W3C could not give a solid technical reason not to use it, they relented and re-included it in the HTML5 specs.
"Web 2.0" - commonly believed to be an official standard - is nothing more than a phrase coined in a 1999 article by a consultant on electronic information architecture, envisioning the user involvement we see today in places like Facebook. People building WordPress sites now claim they are operating in "Web 2.0". There is no real Web 2.0 construct.
So far, no one in a position of power has stated anything concrete showing that (tastefully) hyphenating a domain name will have a negative effect on SEO.
Again, I am referring only to conventional websites - not blogs. And why should Google worry about me, with URLs like this out there: http://www.tampabay.com/news/business/casino-legislation-would-create-three-new-gambling-venues-in-south-florida/1195490