We've seen one of our sites jump from the low 40s to 11 overnight after months of being low.
We're UK based as well, more a directory-style site than e-commerce.
Changing any URL to any other URL without a proper 301 on the old URL can have adverse effects, as old links then go to 404 pages. I wouldn't expect there to be a gain or loss from that inconsistency in your URL structure, though, except where people link to the .html version by mistake.
Also, as a slightly less SEO-related note: for the best UX, try getting rid of the file extensions entirely, and in the homepage's case get rid of the index too. This mostly helps with the homepage, reducing the odds of a link needing a 301 to reach you correctly.
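For illustration, the homepage part of that on an ASP.NET 4+ site could be a sketch like this (the Global.asax placement and index file names are my assumptions, not something from your site):

    using System;
    using System.Web;

    // Global.asax.cs - a minimal sketch: 301 the index file names to
    // the root so old links keep working and only one version of the
    // homepage URL ever gets linked or indexed.
    public class Global : HttpApplication
    {
        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            string path = Request.Url.AbsolutePath;
            if (path.Equals("/index.html", StringComparison.OrdinalIgnoreCase) ||
                path.Equals("/index.htm", StringComparison.OrdinalIgnoreCase))
            {
                Response.RedirectPermanent("/");
            }
        }
    }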
Looking at the footer of your website, I see "Powered by Communications3000 C3MS".
From a quick look at the site it links to, I'm guessing it's something your web developers have put together rather than an 'off the shelf' CMS, so to speak.
I would recommend you find an SEO who knows both SEO and some .NET, who could communicate with you and your developers to see if it is indeed impossible to do a 301, or if your developers are making excuses, misunderstanding the request, etc.
That would probably give a better understanding of the situation.
As everyone else has said, it /does/ sound rather odd that a 301 can't be implemented via one of the various ways of doing it.
.NET is not a CMS system. It is a language your CMS is written in. (Just as you can have the same menu in English and also in French, you could in theory have the same CMS in PHP and in .NET.)
You'd need to give the name of the CMS if you want advice on its suitability.
Honestly, your developer should be able to 301 from URL A to URL B without loops, based on the info provided so far.
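To make that concrete, here's a minimal sketch of what such a redirect map might look like on an ASP.NET 4+ site (the old and new paths are invented placeholders, and in a real CMS the map would normally live in a database or config file rather than being hard-coded):

    using System;
    using System.Collections.Generic;
    using System.Web;

    // Global.asax.cs - a hedged sketch of a one-to-one 301 map.
    public class Global : HttpApplication
    {
        // Old URL -> new URL. Placeholder values for illustration.
        private static readonly Dictionary<string, string> Redirects =
            new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
            {
                { "/old-page.aspx", "/new-page" },
                { "/old-category.html", "/new-category" },
            };

        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            string target;
            if (Redirects.TryGetValue(Request.Url.AbsolutePath, out target))
            {
                // Sends a single 301 and ends the response; it can't loop
                // as long as no target URL is itself a key in the map.
                Response.RedirectPermanent(target);
            }
        }
    }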
We don't have a view-all page (we found them so slow, so long, and with so many links that we had a notable improvement in rankings in general when switching to the quicker paginated versions). And other than the first page, none of the other pages are currently in our sitemap.
I'm not entirely sure how that would stop GWT flagging it as a duplicate meta though, unless you're implying we should also noindex them.
From a navigation point of view, being able to erase the end of a URL and end up at a parent page is excellent for UX. As is not having to recall a file type (the .htm).
It thus wouldn't entirely surprise me if Google favoured such a structure.
I expect, however, that Google infers such relationships more from your on-site interlinking. Breadcrumbs, for example, would probably have more effect. (I do believe there is a markup for them in Webmaster Tools, or at least there was one being beta'd recently.)
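If it helps, the breadcrumb markup being documented looks roughly like this (a sketch in the data-vocabulary.org microdata format; the category names and URLs are invented, so check the current Webmaster Tools documentation before relying on it):

    <!-- Hedged sketch: breadcrumb rich snippet markup, invented URLs. -->
    <div itemscope itemtype="http://data-vocabulary.org/Breadcrumb">
      <a href="http://www.example.com/widgets" itemprop="url">
        <span itemprop="title">Widgets</span>
      </a> &gt;
    </div>
    <div itemscope itemtype="http://data-vocabulary.org/Breadcrumb">
      <a href="http://www.example.com/widgets/blue" itemprop="url">
        <span itemprop="title">Blue Widgets</span>
      </a>
    </div>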
I personally wouldn't make such a change unless there were other issues being fixed at the same time (improving UX would count). Make sure, of course, to do your 301s and change the internal links if you do.
That scale of unique descriptions is well beyond our capacity. We're actually considering dropping the number of items per page too.
Thanks for the help.
Could 'ignore' cause any problems (such as pages that should or shouldn't be indexed)? I was rather surprised to discover that using canonical wasn't enough.
I'm currently working on a site where the URL structure is something like www.domain.com/catagory?page=4, with ~15 results per page.
The pages all canonical to www.domain.com/catagory, with rel=next and rel=prev pointing to www.domain.com/catagory?page=5 and www.domain.com/catagory?page=3.
Webmaster Tools flags these all as duplicate meta descriptions, so I wondered if there is value in appending the page number to the end of the description (as we have with the title, for the same reason), or if I am using a sub-optimal URL structure.
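Concretely, that means the <head> of page 4 carries tags like these (reconstructed from the URLs above):

    <link rel="canonical" href="http://www.domain.com/catagory" />
    <link rel="prev" href="http://www.domain.com/catagory?page=3" />
    <link rel="next" href="http://www.domain.com/catagory?page=5" />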
Any advice?
In that case, I've seen a few people try it with no notable difference. Pre-Penguin there were a few cases here where removing several instances of a keyword in the body seemed to dramatically improve rankings, but that's more removing keyword stuffing than optimising your page to appear unoptimised.
Right now, if your keyword can be there and it reads naturally, then I don't see much reason for it not to be there. In contrast, if your whole page is about blue widgets and the heading /doesn't/ include blue widgets, you'll be confusing people. People also link using the heading/title occasionally, so you should pick up a few genuinely natural links with that heading.
At least as far as Penguin goes, it seems much more link-anchor oriented right now.
Considering that having an h1 as the page's main heading and using h2-h6 for subheadings is proper HTML (or multiple h1s with sections in HTML5), I'd never stop doing it in the hope of an SEO advantage that may or may not survive the next algo update.
Most sites at the very least have an h1 as their main heading; there's nothing over-optimised about it unless you then keyword stuff it or something like that.
Basically, using an h1 for your main heading isn't an SEO tactic; it's what it's actually for.
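To illustrate with a trivial sketch (the copy is invented):

    <!-- One h1 as the main heading, h2-h6 for sub-headings. -->
    <h1>Blue Widgets</h1>
    <p>...</p>
    <h2>Choosing a size</h2>
    <p>...</p>
    <h2>Delivery</h2>
    <p>...</p>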
For the most part, it borders dangerously close to cloaking, and IP addresses are also very fallible for determining a location.
On the other hand, some very minor text changes, with a default version, would probably be fine if they improve the user experience. Remember that Google will only see the default version, so you're not getting any potential location-based ranking.
If it's on a larger scale, I wouldn't bother, and would work on location-specific landing pages instead.
I would perform the search myself in a Chrome incognito window and just look at the SERPs.
Is there, for example, a pack of 6 results taking up that first slot, essentially making you position 7+?
Are all the other results dissimilar to you? In which case, you might not be what people are searching for.
If you're getting impressions, are in the right slot, and there's nothing else odd, compare your meta description to the number 1 result's. People might just be looking at you and thinking you are providing something else.
Just make sure you pick the version with the most links, and that your internal link structure, Google Webmaster Tools settings, etc. are all the same, and it will most likely at worst do very little, or at best help somewhat. Especially since you seem to have no external links using the other formats.
It also tends to make your data a lot cleaner, which always helps.
Many directories are still providing link juice, and it tends to be the least spammy ones.
I would say the best rule of thumb is: any directory you would sign up to for the potential value of its traffic is going to be relatively safe, as it will be providing real value and is thus less likely to be crushed by Google.
Directories for the links' sake are more risky.
I would personally allow the categories to be indexed, but make sure each section has a block of text as an introduction so it's not very thin or near-duplicate content.
Then there's making sure no unwanted URLs or pages get indexed. The search pages, user pages, etc. might take some looking into, to make sure you're not generating 1000s of duplicate pages.
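As a rough sketch of the crawl-control side (the paths below are guesses at typical forum URL patterns, not taken from your site):

    # robots.txt - illustrative only; match these to your forum's real URLs.
    User-agent: *
    Disallow: /search.php
    Disallow: /memberlist.php

Bear in mind robots.txt only stops crawling; for pages that are already indexed, a meta robots noindex tag on the page itself is usually the better tool.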
And it's just a quirk, but the URLs read weirdly with forum.php in them. Rewriting them to just /forum/ might make them a little more usable. Not entirely sure you will get SEO benefit out of it, though.
If any of those sub-pages had links, are ranking, etc., then you're definitely going to have to look into 301s at the very least.
Beyond that, I think it best to give a more specific example or link.
There's the manual request in Webmaster Tools.
Though it does seem odd that the old pages are indexed rather than their 301 targets. Make sure there is a crawlable link to each page somewhere in your site, perhaps even make sure they are still in your sitemap, and that they are not blocked by robots.txt. That should allow Google to re-crawl the pages and realise they have been 301'd.
Also check how the 301 is implemented. Make sure there's not some kind of masking that is redirecting users and not Google. Also make sure it is a 301 and not a 302.
IIS has no problem doing 301s, and if you can use PHP, ASP or anything similar, you can just manually put a 301 on each page if that fails.
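As a sketch of that manual fallback on an ASP.NET page (the page name and target URL are placeholders):

    using System;
    using System.Web.UI;

    // old-page.aspx.cs - a hand-placed 301 on a single page.
    public partial class OldPage : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // .NET 4+: sends "301 Moved Permanently" and ends the response.
            Response.RedirectPermanent("/new-page");

            // On older framework versions the equivalent is roughly:
            // Response.StatusCode = 301;
            // Response.AddHeader("Location", "/new-page");
            // Response.End();
        }
    }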
No rel=canonical solution will result in all 3 sites ranking, as far as I am aware.
Your best option is usually one site with geo-targeted pages. If it has to be 3 sites, then the only real option is to make all that content unique, on unique IPs etc., which at the end of the day is 3x the work or more.
If you are going to do it, I would only do it with your URL or perhaps your brand name as the link text. If it creates a massive exact-match anchor profile, you risk being flagged as manipulative.