You will have trouble getting any site listed in DMOZ!
Not saying it's impossible, but the last 5 sites I've tried to get listed have failed miserably despite following their guidelines to the letter.
Job Title: Owner/Developer
Company: PerfectWeb Creation
Website Description
Web design and marketing specialists focusing on the charity sector.
Favorite Thing about SEO
The blog!
I have gone through this same process recently.
Firebug's Net tab, the Page Speed plugin and Yahoo!'s YSlow plugin were the way to go.
I had a page load time of about 10 seconds before optimising. I reduced this to about 6 seconds by doing the following:
I then managed to decrease this further to about 1.5 seconds by changing server hosts.
If the current site is built on ASP, the credentials to access the database must be stored somewhere within the application, usually in a .config file. There is a small chance that the connection details are encrypted, but in the days of ASP this was the exception rather than the rule. If you can find these connection details you will be able to get at least a read-only view of the database and export the data.
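To illustrate what you're looking for, a connection-strings section in an ASP.NET web.config typically looks something like this (the server, database and credential names below are placeholders, not from the site in question):

```xml
<!-- Hypothetical web.config fragment; values are illustrative only. -->
<configuration>
  <connectionStrings>
    <add name="SiteDb"
         providerName="System.Data.SqlClient"
         connectionString="Server=dbhost;Database=sitedb;User Id=reader;Password=secret;" />
  </connectionStrings>
</configuration>
```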
I find it strange that the owners of the site (whether that be you or a client you're working for) don't have the connection details of the db.
If you really can't get db connection details then yes, scraping the article and importing into the new db would be the best way forward. Screen scraper basic edition should suffice for this (it's free) - http://screen-scraper.com/download/basic_choose_platform.php
If the site was built 7 years ago, and has had no work done on the programming side since, there is a good chance the database is not up to scratch and has lots of dirty data (not for sure, just a good chance).
There is also a good chance that it was created with a custom CMS rather than one of the more popular CMSs, so bear that in mind when trying to work out which one was used.
If I was faced with this situation, and I knew little about the old system, I would do the following:
Once you have the new site up and running (in a test environment) you can think about how you are going to redirect pages: use tools like Open Site Explorer to find the most linked-to URLs, then use URL rewriting (rather than redirection) to do the mapping (I assume the new site will be on the same domain).
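As a sketch of that mapping step, assuming the new site sits on IIS 7+ with the URL Rewrite module installed (the rule name and paths below are hypothetical):

```xml
<!-- Hypothetical IIS URL Rewrite rule in web.config: the old .asp URL keeps
     working but is served by the new page (a rewrite, not a redirect). -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="OldArticleMapping" stopProcessing="true">
        <match url="^articles/widget-guide\.asp$" />
        <action type="Rewrite" url="/articles/widget-guide/" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```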
Given that you've said your ultimate aim is for users to purchase things from your affiliate's shop, you are really measuring the wrong thing by measuring page impressions.
Imagine a scenario where all users arrive at your site on an article page. At the end of the article there is a link to buy a product. This link takes the user to your affiliate's shop, where he/she then buys a product. The user has visited exactly one page on your site (1 page impression) and earned you £x in revenue.
The above example would be more valuable to your business than, say, a user landing on the same article page, reading 5 other related articles and then purchasing nothing.
If the website makes money through selling affiliate products, you probably want to be measuring the number of sales made vs. the total number of unique visitors (conversion rate), or the number of pages a visitor reads before purchasing from you.
Don't measure the wrong statistic. Make sure you know what you are trying to achieve and measure that!
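To make the measurement concrete, conversion rate is just sales divided by unique visitors. A minimal sketch (the function name and figures are my own illustration, not from any analytics tool):

```python
def conversion_rate(sales, unique_visitors):
    """Percentage of unique visitors who went on to buy."""
    if unique_visitors == 0:
        return 0.0
    return 100.0 * sales / unique_visitors

# e.g. 30 affiliate sales from 1,500 unique visitors -> 2.0%
print(conversion_rate(30, 1500))
```

Tracking this number over time tells you far more about affiliate revenue than raw page impressions ever will.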
Q&A sites tend to be more SEO friendly because of things like "related questions", "tag clouds" etc.
They tend to work more like blogs with comments rather than forums.
As the article suggests, more people are searching using questions nowadays - e.g. "where can I get an iPhone" as opposed to "buy iPhone". Q&A sites lend themselves to ranking highly for this type of query.
The article also suggests, and I'm inclined to agree, that Google likes to show content from different data sources in its top positions. Given the web's trend towards Q&A sites, my personal view is that Google is more likely to rank a Q&A site higher than a forum (all other things being equal).
Yes, there is a very good chance that this is negatively affecting you.
I can't answer all of the questions, but I can help you out with some of them:
1. Given that it's a national newspaper, the PR of the site should be fairly high, and this should flow through to the article page. Building up links to the article page seems like a lot of effort that could be better spent getting more quality links to your own site, but I can see that this could have some benefit.
2. The article will certainly end up in the archive. It is unlikely, however, that it will be deleted. Think of it from the newspaper's perspective for a second: what benefit would it bring THEM to delete the article? None. These types of sites want great quality content that receives inbound links - deleting pages that could potentially have inbound links would be a terrible SEO strategy.
3. Only if it redirects and forwards properly (see next question)
4. Yes, but you will lose some of the link "juice"
5. I'm not sure on this one. I would like to think this is an evolving process, but I don't have any facts to back this up.
6. Seems like it could be a lot of effort if you were to do this for every article you get published that links back to your site. You will only be boosting the PR of the site hosting your article, and only a small proportion of this will eventually flow back to your site. You could spend this time and effort getting quality links to your own site instead.
7. You will receive some link juice. You may find conversions are higher after people have read the news article.
Using rel=canonical will not affect the ranking of the main page; in fact it will boost it. Any links generated to any of your country-specific landing pages will pass their link juice on to the main page specified in the rel=canonical.
I would recommend you keep your 250+ landing pages. Having these pages will also help optimise your conversion rate, as users from those countries will be able to identify with the site more easily (after seeing their country's flag etc.). If the pages are not optimised for country-specific terms, I would also recommend you look at doing this as well, not necessarily for SEO benefit, but for CRO benefit.
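For reference, each country landing page would carry a canonical link in its head pointing at the main page, something like this (domain and paths are placeholders):

```html
<!-- On, say, www.example.com/landing/fr/ - canonical points at the main page -->
<link rel="canonical" href="http://www.example.com/landing/" />
```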
In terms of SEO, they carry little weight and in my experience they're not really worth the time you need to spend writing them.
However, if you're planning on submitting your site to any link directories, it is worth noting that meta keywords are sometimes used by the directories to help them organise and rank your site.
I would certainly put alt tags on all images for accessibility and usability reasons. The SEO impact of alt tags is minimal, but I've never heard of alt tags diluting SEO success - that claim sounds dubious to me.
I would be wary of being "too spammy" with your alt tags, as it is entirely possible that Googlebot might pick up on this.
It is important to note that good descriptive alt tags (with your keywords) and keywords within your image filenames will certainly help your images rank better on Google Image search. This may or may not be an alternative vertical source of traffic to your site that you may or may not want - it really depends on the type of site you're running.
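By way of illustration, a descriptive (non-spammy) alt attribute together with a keyworded filename looks like this (the product and path are made up):

```html
<!-- Keyword in the filename, natural description in the alt text -->
<img src="/images/red-leather-sofa.jpg"
     alt="Red leather three-seater sofa in a modern living room" />
```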
Google will not penalise you for getting links from different niches. The links just will not carry as much weight as if they were from relevant content.
Google will penalise you if this is done in a spammy way. For example, if you all of a sudden receive hundreds of links from restaurant sites, where there is no context to your content, Googlebot is going to find this strange and potentially raise a flag against your site.
I agree that directories would be a better way to organise your content.
I would aim to get people to link to www.domain.com/uk etc., but potentially still use www.domain.co.uk in offline marketing and 301 redirect it to www.domain.com/uk. If that makes sense?
The TLD will certainly have offline local value even if it doesn't have SEO benefit.
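To sketch the redirect side of that, assuming www.domain.co.uk is served by Apache with mod_rewrite enabled (domain names are placeholders):

```apache
# Hypothetical .htaccess on www.domain.co.uk:
# 301 every request to the matching path under www.domain.com/uk/
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?domain\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/uk/$1 [R=301,L]
```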
There are possibly a few reasons for this:
1. The user is filling in the contact form, but then decides not to send it
2. The user is filling in the contact form, but then can't send it because of an error with validation or something server side
3. The user is filling in the contact form, and is sending it (you don't specifically say that they don't take any action on that page)
4. A percentage of the users will almost certainly be looking at the contact page for social proof - is there an email address I can copy, is there a physical address, do they have a phone number, is there an easy complaints procedure, etc.? These people will then often navigate on (back to the homepage or another site page)
5. People might want to see where the company is based and then decide that the company is not relevant to them because they are in a different country/state/town/city etc.
Plus, I'm sure there are a million other reasons.
I recommend installing ClickTale on the site. The free version of the product should be sufficient to gather some video recordings of what the users are actually doing on the page. This is the best way to see what the user is doing (but not a great way to know what they're thinking! - although you can make some educated guesses). Check it out at http://www.clicktale.com
Of course it is possible, although it seems somewhat unlikely. Depends on the size of the site, the type of the site, their social media profile, how good their PR team are, what relationships they have with said review sites, magazines, newspapers etc.
I too would be a little suspicious if the links are very keyword-rich; this suggests some engineering on the part of the site. Not to say they have used black hat techniques, but maybe they have urged the companies to use certain anchor texts etc.
Without knowing which site you're talking about, it would be difficult to guess what efforts they may have made to accomplish this.
Hi Boson,
Google does automate this process and does use references to your site from around the Web. It also takes into consideration any DMOZ listings (by default). This is confirmed here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35264.
I would recommend that you check your DMOZ listing (if you have one) and if that is wrong, attempt to get that changed, or use the NOODP meta tag described in the article.
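For reference, the NOODP tag mentioned in that article is a one-liner in the page's head:

```html
<!-- Tell search engines to ignore the ODP/DMOZ description for this page -->
<meta name="robots" content="noodp" />
```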
If that has no impact you should double check that everything is OK in your title and meta tags (e.g. there aren't any duplicate tags).
If that doesn't bear any fruit, then your next step would probably be to try to contact some of the people that are linking to you with a typo and ask them to update their links - this is far from ideal though, so only do it as a last resort.
They are basically the same. They both use the ODP (Open Directory Project). The Google Directory apparently applies Google's PageRank algorithm within the categories.
When searching for Google Directory on Google you can see this as the site description:
*"Google Directory - Searchable directory based on the ODP, combined with their own PageRank algorithm within each category."*
I currently work as a software developer at the Bank of England and also own a small business that helps UK driving students find earlier driving tests.