Does anyone have thoughts on whether I should try to reduce the number of links in the sidebar by loading them in via JavaScript? I've got 160 navigational links. How would Google react to that?
Posts made by TaitLarson
-
RE: What's the best strategy for reducing the number of links on a blog post?
-
RE: What's the best strategy for reducing the number of links on a blog post?
Hey Ryan, thanks for taking a look. nofollowing the comment links is a good idea!
I'm under the assumption that nofollow links are still hurting my SEO efforts due to how Google passes PageRank through links.
Thanks for the tips on looking at other blogs and seeing how they use nofollow. Again, I guess the big question is whether I should be reducing the number of links on my site (including the number of nofollow links).
Can you give a more specific example of "hooks for people to link back to"?
-
RE: What's the best strategy for reducing the number of links on a blog post?
That seems like a pretty practical solution. Some people won't stop by anymore, but those are the people who are just trying to promote themselves. I wonder if it might actually increase comments b/c now the form will be even shorter and simpler to fill out.
-
RE: What's the best strategy for reducing the number of links on a blog post?
Thanks EGOL. To be clear, we remove most comments with links embedded in the text (unless they are relevant). However the "website" field of the comment form creates a nofollow link to a site if filled in.
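For context, the link that the website field produces looks something like this in the rendered comment (the URL and anchor text here are made up):

```html
<!-- rel="nofollow" tells search engines not to pass PageRank through
     this link; the href and anchor text are hypothetical examples. -->
<a href="http://example.com/" rel="nofollow">Commenter Name</a>
```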
I don't want to remove comments. People really do contribute in our comments section. Comments show visitors to our blog that we have a nice little community of readers. While there is some abuse, we work to remove spammy comments and blacklist domains that continue to spam us.
-
RE: How can I check a website to see if it is "nofollow"?
Hey Domaon,
This happened to me once and I was furious. I would contact the publisher and show them this article.
http://www.mattcutts.com/blog/pagerank-sculpting/
Basically it explains that the site gains no additional PR for its other links by nofollowing your links. Who knows, maybe they'll change their stance on nofollowing links to user submitted articles.
That said there are good reasons why a link can have a nofollow attached to it. However, if you contributed an article to an ezine that selectively publishes user submitted content you deserve a "dofollow" link.
-
What's the best strategy for reducing the number of links on a blog post?
I'd like to optimize my blog better for search. The first recommendation I got from my SEOmoz Pro Campaign Crawl was that I needed to reduce the number of links per page on my site. I have lots of links from navigational items in the sidebar that people do click on. I'd really like to keep some or all of the tags and categories I list. Comments are another issue. Most of our posts get about 10 comments. However, our best posts get 50-100 comments. Those comments create a lot of links.
I was planning on attempting to reduce the number of links using javascript but I guess Google understands javascript now. I may still do this b/c our pages are huge and some progressive rendering would likely help the user experience.
Can you use javascript (ajax or otherwise) to limit the number of links on your page in a way that helps your SEO efforts? Any specific suggestions for reducing links that come from comments and navigational items?
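To make the progressive-rendering idea concrete, here's a minimal sketch of what I had in mind. The data shape, endpoint, and element id are all hypothetical, and this is just one way to keep the link markup out of the initial HTTP response:

```javascript
// Build the sidebar link markup from data fetched after the initial page
// load. The { href, text } object shape is a hypothetical example.
function renderLinks(links) {
  return links
    .map(function (link) {
      return '<li><a href="' + link.href + '">' + link.text + '</a></li>';
    })
    .join('\n');
}

// In the browser you would fetch the link data once the page has rendered
// and inject it, keeping the links out of the initial HTML response
// (assuming a /sidebar-links.json endpoint and a #sidebar-nav element):
//
// fetch('/sidebar-links.json')
//   .then(function (res) { return res.json(); })
//   .then(function (links) {
//     document.getElementById('sidebar-nav').innerHTML = renderLinks(links);
//   });
```

Whether Google still counts links injected this way is exactly what I'm unsure about.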
How much will reducing the number of links on a given page help with SEO? Any simple way to estimate or quantify this without diving in?
Thanks in advance!
-
RE: Best Link Building Practice
Be sure to view Rand's recent webinar The Future of Link Building.
You can also try the Juicy Link Finder tool from SEOmoz, though I've had trouble getting it to work recently.
-
RE: Category Pages with Sub-Categories
You don't have to create anything to start planning out your URL structure. Lots of times when building something I think about the keywords and then just start laying out the URLs that I think will support those keywords from a search and overall usability perspective. I'll do this even before I start mocking things up.
From your design you could put all mowers by one company on the same page and then have tabs to separate the content. Implement the tabs in javascript so that they just enhance some good solid HTML lists or divs. That would create one page for each mower brand. Are you hoping to SEO well for "mowers" or "toro mowers"? Perhaps you are already planning on doing this?
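To sketch that enhancement idea (the panel ids and two-tab setup below are hypothetical), the tab logic can be kept as a pure function so the underlying HTML lists stay in the page and remain crawlable:

```javascript
// Decide which panels to hide when a tab is activated. Pure logic with
// no DOM dependency; the ids passed in are hypothetical examples.
function hiddenPanels(panelIds, activeId) {
  return panelIds.filter(function (id) { return id !== activeId; });
}

// In the browser the panels would be plain HTML divs or lists (so search
// engines see all the content even without JavaScript), and a click
// handler would only toggle visibility:
//
// var ids = ['toro-home-mowers', 'toro-commercial-mowers'];
// function showTab(activeId) {
//   var hidden = hiddenPanels(ids, activeId);
//   ids.forEach(function (id) {
//     document.getElementById(id).style.display =
//       hidden.indexOf(id) === -1 ? '' : 'none';
//   });
// }
```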
In regards to the anchor text, can you just link to "Toro Mowers" and then have people click the tab they want? Otherwise perhaps you could do something like "Toro Mowers - Home", "Toro Mowers - Commercial". At least that gets the keywords at the front of the anchor text.
The structure of your site will be important for SEO but start by finding the list of keywords to target. Consider keyword competition and search volume when making the list. Come up with one page for each of those keywords. Then try to come up with an overall URL structure where you can SEO all those pages well. Finally, don't worry about SEO on each and every page you create. You are going to have some pages that add to the overall experience of your site that might not target keywords.
You are smart to think about creating only the pages that you have good content for. I wouldn't force myself to create any other pages.
-
RE: Category Pages with Sub-Categories
Could you be more specific? I'd love to see the full URL paths of the categories and sub-categories before I weigh in.
-
RE: Are there any "legitimate" paid links in Google's eyes?
I always thought Google's policy towards the Yahoo Directory was strange. However, perhaps it teaches us that if the directory charges for something other than placement (i.e. pay us and we'll expedite adding your listing to our directory) then the paid link is "legitimate".
-
RE: Website redesign - how do I avoid screwing up my site SEO?
EGOL is hinting at something I'll say more explicitly: why don't you split your site redesign into two phases?
- Phase 1: Optimize / Update the UI of your site. Keep the markup that's relevant to your SEO efforts the same or improve it.
- Phase 2: Change the URLs of the pages on your site.
Once the dust settles from Phase 1 and you've seen how Google has responded to your updates, consider whether or not you really want to do Phase 2.
All that said, I have implemented both Phase 1 and Phase 2 at the same time before. I was careful to add lots of 301 redirect rules to my .htaccess files using mod_rewrite. I did not experience any kind of rankings penalty from Google.
-
RE: Javascript
Again, great resources, Daniel. The first link provides some empirical evidence that AJAX-based links do get interpreted. SEOmofo had a nice recommendation that should stop Google from indexing your JS if need be. He basically said put your JS in an external file that you disallow in robots.txt.
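For anyone following along, that suggestion would amount to a robots.txt entry roughly like this (the file path is just an example):

```text
# Keep crawlers away from the script that injects the extra links.
User-agent: *
Disallow: /js/progressive-links.js
```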
From your second link:

The search appliance only executes scripts embedded inside a document. The search appliance does not support:
- DOM tracking to support calls, such as document.getElementById
- External scripts execution
- AJAX execution
Not exactly sure what "AJAX execution" means. However, if it means downloading JSON or JS and evaluating it, that makes sense. Perhaps no external JS gets executed by Google?
The third link discusses the "agreement" you can make with a crawler if you have an ajax based site using hash bang urls. Not super relevant for me but good to know so thanks!
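For reference, that agreement works by letting the crawler swap the hash bang fragment for a query parameter and fetch an HTML snapshot; roughly (URLs are made up):

```text
Pretty AJAX URL the user sees:
  http://example.com/#!page=about

URL the crawler requests instead, expecting an HTML snapshot:
  http://example.com/?_escaped_fragment_=page=about
```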
-
RE: Javascript
Thanks very much for this. Can't wait to check these resources out.
-
RE: Javascript
Hey Daniel,
Would you mind diving into that statement a little more? I didn't realize that Google could execute 90% of JavaScript. Do you think they will load in external JavaScript files? Does Google make AJAX calls?
I only ask the questions b/c I have a web site whose home page has too many links and too much HTML. I'd love to use JavaScript to do some progressive rendering and keep some links and additional HTML out of the initial HTTP response sent back when someone requests a page on our site.
Thanks in advance!
Tait
-
RE: Should I shorten my urls?
You could definitely redirect URLs using .htaccess and mod_rewrite. An example rule would be something like
RewriteRule ^q/(.*) /?question=$1
or
RewriteRule ^q/(.*)-(\d+) /?question=$2-$1
See the mod_rewrite documentation or just ask a competent developer about the rules above.
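To flesh those out a little, a minimal .htaccess might look like the following. The patterns and flags are a sketch under my assumptions about your URLs, so test before deploying:

```apache
# Enable mod_rewrite (requires AllowOverride FileInfo or similar).
RewriteEngine On

# Internally map the short URL /q/some-slug-123 to the real page, so
# visitors and search engines only ever see the short form.
# [L] stops rule processing once this rule matches.
RewriteRule ^q/(.*)-(\d+)$ /?question=$2-$1 [L]

# If the long form is already indexed, you could also 301-redirect it to
# the short form so existing rankings transfer (sketch only):
# RewriteCond %{QUERY_STRING} ^question=(\d+)-(.*)$
# RewriteRule ^/?$ /q/%2-%1? [R=301,L]
```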
-
RE: Are post tags on blogs still useful?
One quick aside: categories in WordPress are easier to SEO than tags. WordPress collects category descriptions, and plugins like All In One SEO Pack can use this info to write a meta description for you. WordPress doesn't collect descriptions for tags, so plugins can't use this info to write out a meta description for your tag pages.
-
RE: The ultimate top 10 directory list
Here are 7 "directories" (using that term pretty loosely) where you could try to get listed.
1. Dmoz
2. Yahoo! Directory
3. GetListed (for businesses)
4. NetworkedBlogs (for blogs)
5. Crunchbase (if you've got a start up)
6. Technorati (for blogs...kind of outdated)
7. Wikipedia Categories (Rand discussed this in the most recent SEOmoz seminar).
-
RE: How to achieve the highest global and local relevance in google?
There was a really good Whiteboard Friday on this topic.
http://www.seomoz.org/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday
Rand discusses the tradeoff of the approaches you mention above. Be sure to also read the comments. Some commenters mentioned they had really good results using geotargeting in Google Webmaster Tools.
-
RE: Are post tags on blogs still useful?
Search engines can't really tell the difference between types of pages.
Do you want your tag pages to rank in search results? If so, you could add a description. The All In One SEO plugin I mentioned could help with this. However, as I alluded to before, it's tough to get lots of tag pages to rank well in Google. I wouldn't worry too much about adding meta descriptions for them.
-
RE: HTTP Headers
Some of those headers might help you serve your page faster. They might save you and your users some bandwidth.
I guess if you think that page load time is super important (most say it's a small factor), then you could argue those are important for SEO, but generally speaking Marcus is right.