I found a Matt Cutts video which says that keyword-rich domains are being devalued.
I would recommend doing more brand-awareness exercises rather than switching domain names for a more keyword-rich domain.
First of all, make an image sitemap and submit it to Google. Here are the guidelines for doing so:
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=178636
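To illustrate, here is a minimal sketch of an image sitemap following those guidelines (the domain, file paths, and caption are hypothetical placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
          xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
    <url>
      <loc>http://www.example.com/cars/</loc>
      <!-- One image:image entry per image on the page -->
      <image:image>
        <image:loc>http://www.example.com/images/red-car.jpg</image:loc>
        <image:caption>Red sports car on the test track</image:caption>
      </image:image>
    </url>
  </urlset>

Save it as something like image-sitemap.xml and submit it in Google Webmaster Tools alongside your regular sitemap.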
Then, define some default tags and categories. As Barry pointed out, it makes a lot of sense to categorize; that would add relevance to your website in searches.
Then, on the backend, assign default tags to the categories. For cars, the default tag could be "automobiles"; for scenery, it could be "nature", and so on. That would help your optimization.
You can try mod_rewrite on Apache or, if it's not a big website, URL-based redirection in .htaccess.
That way, you can layer the architecture so that it serves something like a vanity URL.
That would solve your URL problem.
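As a rough sketch of what that could look like (assuming Apache with mod_rewrite enabled; the paths and parameter name are hypothetical):

  # .htaccess: serve a deep internal path under a short vanity URL
  RewriteEngine On
  RewriteRule ^golf-clubs/([a-z0-9-]+)/?$ /catalog/sports/golf/item.php?slug=$1 [L]

The visitor (and Google) sees www.example.com/golf-clubs/driver while the server quietly fetches the longer internal URL.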
I would say that "golf clubs" would serve you better than just "clubs". "Clubs" can also mean a gathering place for folks with similar interests, whereas you get clear results for the keyword "golf clubs" if you mention it in the URL itself.
Hi Laurent,
If it's WordPress, I can think of a few plugins to help you there. SEO Smart Links is a great plugin in that respect. URL: http://www.prelovac.com/vladimir/wordpress-plugins/seo-smart-links
There is also a popular-posts plugin which will showcase your popular posts in the sidebar. This is a great way to leverage your old/archived content.
First of all, do you have HTTPS on your website? It would be a good idea to disallow robots, through robots.txt, from indexing sensitive parts of the website (user login, etc.).
Then submit your website through a sitemap. (You can also reference sitemaps in your robots.txt.)
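A minimal robots.txt sketch along those lines (the disallowed paths and sitemap location are hypothetical placeholders):

  # Keep crawlers out of sensitive areas
  User-agent: *
  Disallow: /login/
  Disallow: /account/

  # Point crawlers at the sitemap
  Sitemap: http://www.example.com/sitemap.xml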
That should get the ball rolling. You shouldn't expect huge benefits over a short period of time, but results will slowly show up.
Shwan,
I have noticed that when you have a long URL structure with multiple folders, Google tends to lose "interest" in your deep pages.
Let me give you an example: suppose you have a domain called www.website.com with a category called gemstones. Under gemstones, you have diamond as a subcategory and solitaire as a page.
If you consider your homepage to have an importance of 1, no category page will have an importance greater than or equal to 1. So your category page gets a page-weight value of, let's say, 0.9. Your subcategory page is treated the same way, getting a page weight of, say, 0.8, and your solitaire page gets a value less than 0.8. If you cut out one or more levels in your URL, you have a better chance of a higher value being assigned to your page.
Now, coming to your question. Breadcrumbs are essentially meant to help your users navigate better, so your website hierarchy (the folders and subfolders, or categories and subcategories) should be reflected in your breadcrumb.
So, keep your URLs short, but let your breadcrumbs follow your website's flow.
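For instance, sticking with the gemstones example (a hypothetical sketch), the URL stays short while the breadcrumb still exposes the full hierarchy:

  <!-- Page lives at the short URL www.website.com/solitaire -->
  <div class="breadcrumb">
    <a href="/">Home</a> &gt;
    <a href="/gemstones/">Gemstones</a> &gt;
    <a href="/gemstones/diamond/">Diamond</a> &gt;
    Solitaire
  </div>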
The problem is very common for content-heavy websites where content lies somewhere way down the hierarchy.
I am assuming a few things here:
1. The webpage you are referring to has already been crawled at least once.
2. It is accessible from at least one link on your homepage.
3. It does not have a huge number of outbound links; that is, no more than around 100 (within and outside your domain).
Your first task should be to get Google to crawl the page(s):
1. Get a tool like GSiteCrawler and crawl your entire website. Create and submit an XML sitemap of your website to Google Webmaster Tools. Create links from your pages that are already indexed to this page (or pages). That way, Googlebot will find its way eventually.
2. Update the page with fresh content frequently. Create an RSS feed of the content updates and serve it up front on the homepage or on an important page of your website (one which ranks well in Google).
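A bare-bones RSS 2.0 sketch for such an update feed (titles, URLs, and dates are hypothetical placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <rss version="2.0">
    <channel>
      <title>Latest updates</title>
      <link>http://www.example.com/</link>
      <description>Fresh content from across the site</description>
      <item>
        <title>New deep page worth crawling</title>
        <link>http://www.example.com/deep/page.html</link>
        <pubDate>Mon, 06 Jun 2011 09:00:00 GMT</pubDate>
      </item>
    </channel>
  </rss>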
All said, you have to wait and watch. There is no way you can forcefully ask Google to crawl your webpage. Also, updating your homepage content (just text with no link to your deep pages) wouldn't help speed up the process. Still, it's good practice to keep your homepage content fresh so that Google's bots visit your website regularly and you get Google love.
Hope that answers your question.
Well, a strict no-no on that practice. Your competitor will have issues in the long run. It seems quite easy to get folks to comment, and it is aimed at "gaming" Google's ranking algorithms. That practice will surely fall prey to Google's webspam team one day. It's best not to tread that path at all. (By the way, Matt Cutts, head of Google's webspam team, is one of SEOmoz's users... I hope he gets to read this post and suggest algorithmic changes.)
If you are competing in the local space, you might consider enrolling in local directories and Google Places and asking people to write reviews there. Company directories like Hotfrog could be beneficial to you as well.
Sadly, domain age is still a factor in ranking, and older websites tend to rank better due to their ranking history in Google.
Construction agencies have mock-ups, build-outs, and all kinds of working models which they can showcase. That is something you can identify and do well with.
Also, construction is a very under-explored niche where you do not find local blogs catering specifically to it. It would be a good idea to start a blog giving advice, tips, and so on.
At the end of the day it's all about engagement, and I am sure that one day you will be able to make it through to the top.
I would suggest that you go to GWT (Google Webmaster Tools) and do the following things.
Add all your regional websites and set geo-targeting in GWT.
That can be found under Site configuration > Settings.
Let Google know that it's indexing the wrong versions.
Sitelinks can be removed from GWT as well, under Site configuration > Sitelinks.
The 100-link figure is more of a guideline than a strict rule. Your first objective should be to get the page indexed. Google's Query Deserves Freshness (QDF) algorithms will eventually index your URL; it's a matter of time once you link to that page from at least one page.
My advice would be to link it from more pages (if possible) and keep the content fresh.
You could try the RSS idea as well.
Normally, .edu and .gov websites qualify as trusted websites due to their entity association. Apart from those, there are renowned websites like Wikipedia that normally provide unbiased views on topics; they are also termed trusted websites. There may be other inclusions depending on the niche, but as a rule of thumb, .edu and .gov are the most trusted domains.
If you have enabled the canonical tag on your URLs to point the shadow copies of your webpage to one location, it should be fine. In the long run, however, you should think about setting up a 301 redirect to your homepage URL for the URLs that are shadow copies.
I would suggest that, depending on the number of pages, you do either an .htaccess-based redirection or an Apache mod_rewrite rule.
Based on the Apache version you have on your server, these documents could help you:
http://httpd.apache.org/docs/2.0/mod/mod_rewrite.html
http://httpd.apache.org/docs/current/mod/mod_rewrite.html
Please note: this is not something you should do unless you know Apache configuration yourself and you understand the directives and their logic.
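For a handful of pages, the .htaccess rules might look roughly like this (a sketch only, assuming Apache with mod_rewrite; the filenames and domain are hypothetical):

  RewriteEngine On
  # 301 explicit requests for /index.php to the root URL.
  # Matching THE_REQUEST avoids a redirect loop when DirectoryIndex
  # serves index.php internally for requests to /.
  RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php
  RewriteRule ^index\.php$ http://www.example.com/ [R=301,L]
  # 301 the stray /home.html copy to the root as well
  RewriteRule ^home\.html$ http://www.example.com/ [R=301,L]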
I don't think there are legitimate ways to influence Google Suggest for popular keywords. I have noticed one thing, though: the popularity of a particular term leads to its inclusion in the suggest list. Example: I ran an awareness/social campaign to save our historical monuments from vandalism by making a website where people could scribble whatever they wanted.
That campaign went off really well, with retweets and shares among influential folks. It got shared on bookmarking websites as well. Suddenly, I started seeing the keyword "responsible travel" coming up in suggestions. But as the momentum died, we lost that preference. Maybe the QDF algorithm kicked that keyword out?
Let me give you an example:
If there are, say, 3 copies of your webpage:
www.domain.com; www.domain.com/index.php; www.domain.com/home.html
Ideally, you would want everyone to land on the first option, so here is what you could do.
Activate the canonical tag for URLs #2 and #3, and in the rel=canonical tag specify the complete URL of option #1. That way, even if Google crawls URLs #2 and #3, it will know that the URL that should be considered is URL #1.
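Concretely, the tag sits in the <head> of copies #2 and #3 and points at #1, roughly like this:

  <!-- In the <head> of www.domain.com/index.php and www.domain.com/home.html -->
  <link rel="canonical" href="http://www.domain.com/" />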
Note that rel=canonical does not redirect the page, unlike a 301 or 302 redirect, where the visitor is actually sent to the URL you want.
Sounds fishy to me. It's not plausible that those links were all acquired the white-hat way. There are some possibilities I can think of: paying for inclusions, getting the majority of links from press-release product announcements, or some viral element on the website which has been covered by reputable media sources.
It would be interesting to explore the full set of links.
A rel=nofollow attribute on the embedded link would be a good thing to add, since that tells Google not to pass any link authority to that URL.
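In markup, that is just the rel attribute on the anchor (a hypothetical sketch; the URL is a placeholder):

  <!-- Tells Google not to pass link authority through this link -->
  <a href="http://www.example.com/embedded-widget" rel="nofollow">Embedded widget</a>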