Just a bit of clarification needed... what pages are marked "noindex, nofollow"? Sometimes NoIndex,NoFollow or NoIndex,Follow can be useful for certain pages.
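For reference, the robots meta variants in question look like this in a page's head (standard syntax, shown here as an illustration):

```html
<!-- Keep the page out of the index AND don't follow/pass equity through its links -->
<meta name="robots" content="noindex, nofollow">

<!-- Keep the page out of the index, but still crawl and pass equity through its links -->
<meta name="robots" content="noindex, follow">
```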
Posts made by MikeRoberts
-
RE: To Follow or Not To Follow...... ?
-
RE: Title Tags: Does having the singular and plural version of the keyword hurt the ranking?
True true. SERPs for a singular will not be 100% the same as SERPs for the plural in many cases, but there are often overlaps. Keyword research will help determine which may be the better-trafficked and/or more valuable term. Natural inclusion in the body can potentially make up for lack of inclusion in the title. Also, considering that Google will in some cases change your title and description to better suit a searcher's query for which you are also relevant, you can't rely too heavily on title optimization alone as a ranking factor, though it is a viable signal.
-
RE: 301 redirect all 404 pages
If people are only occasionally typing in "/troussers" instead of "/trousers" then let it 404. It's there to let people know "I'm sorry, this isn't here. Perhaps you misspelled something." You could always 301 it if you really felt like it because it wouldn't hurt anything in the long run.
Now, if you found that you're sending 500 people a day to a 404 page for "/troussers" when they're looking for "/trousers" and you find there are relevant inlinks pointing at the wrong page then by all means go and 301 those people to the correct page. They'll be better served by it. But if you're redirecting all of those people to "All Categories" then you aren't being thoughtful of the customer's needs.
Indiscriminately 301ing everyone to "All Categories" without considering their intentions is not helping that customer and will likely wind up with an ever-increasing bounce rate on "All Categories".
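If a misspelled URL does turn out to deserve a redirect, it's a one-line rule. A sketch assuming an Apache server with mod_alias enabled, using the /troussers example above:

```apache
# Send the common misspelling to the real page, passing link equity along with it
Redirect 301 /troussers /trousers
```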
-
RE: Optimizing web page
And to expand on Kevin's answer.... results from whitehat SEO (i.e. following best practices) can sometimes take months before they really kick in. We're working towards the long haul, not the short game. Once you have all of your on-page/on-site stuff done, now's the time to start looking at relevant links, guest post opportunities, outreach, social media, etc. Then you can come back to the on-page/on-site stuff, see how it's been working, determine what else can be tweaked/fixed/changed, and continually update your site(s) with new, fresh content on a regular basis. Lather, Rinse, Repeat.
-
RE: Title Tags: Does having the singular and plural version of the keyword hurt the ranking?
In most cases Google is smart enough to understand that a page relevant for "Wood Desk" could or should show up in searches for "Wood Desks" and vice versa. As such, it's not really necessary to make sure that you shoehorn in all of the plurals and singulars of your core terms. Worry about it more from a Human standpoint. Making the title more human accessible will help with clickthroughs, visits, and so on. Forcing multiple variations of the same word into a title in order to attempt catching every variable will probably make people skip over you. And ultimately, getting the qualified traffic is what much of SEO is about.
-
RE: Heavy Internal Linking Help
Thanks Joram. Part of me was thinking the internal links were a higher-priority issue, and the other part of me was thinking it's not that big of a deal if no one is having issues with it.
As for the Search Bar, one of our coders is currently in the process of tweaking our auto-complete to make search more robust... so that will be coming down the line in the relatively near future. We've found that most people using our search bar are searching for highly specific terms like model numbers, which, unfortunately, don't return the correct information unless entered exactly (that's being worked on as well). I believe I've suggested to management in the past making the search bar more prominent, but with our current setup the only feasible way of doing this involved adding a second line to the Top Navigation, which then caused some conflicts with the dropdown menus. (It's something I can look into again though.)
The Samples page... yep, it's a mess. I've actually suggested creating samples category pages to make things easier but that's been shot down. There was talk of using CSS to hide the various sections until someone clicked to expand, but that was held off at some point. Might be worth giving up on lessening the internal linking and looking at that again to make Samples more manageable.
-
Heavy Internal Linking Help
One of the sites I work on is a home improvement ecommerce website that does fairly well for its niche. One of the biggest problems that we're not sure how to adequately handle is a heavy internal linking issue. The homepage (http://www.fauxpanels.com/) has approx. 226 internal links which is mainly due to the navigation structure. There are far worse pages though (the Samples page http://www.fauxpanels.com/samples.php has over 800 internal links).
For the most part, management doesn't want any massive changes to the navigation layout. The Top navigation bar has a number of dropdown menus when you hover, the Left Navigation Bar expands to show more choices, and the Bottom navigation bar in many instances is just repeats of links that can be found elsewhere. Also, the product links in the body of the page can be found linked in the Left Navigation. This is not what I would personally consider the best way to handle navigation but the Customer Service Department has gotten numerous calls and emails over the years about how much people love our navigation and how easy it is to find things.
My thought was trying to lessen the number of links by grouping things more often into Category pages/hub pages where applicable so we can remove some of the links. We've also considered NoFollowing links, but my understanding is that even if you NoFollow, the link equity is still divided by the number of on-page links.
So, any of you much more experienced SEOs have any idea how I can lessen the heavy internal linking without completely re-doing the site's navigation layout and not harming link equity, ranking, etc.? Or, conversely, would you consider having an average 200-300 internal links per page not to be a real issue given the positive effect it has apparently had on user experience?
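For anyone who wants to measure the problem page by page before deciding, a rough internal-link count can be scripted with nothing but Python's standard library. This is a quick sketch; the domain and HTML below are stand-ins, not the actual pages:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class InternalLinkCounter(HTMLParser):
    """Counts <a href> links that stay on the given domain."""
    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.internal = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).netloc
        # Relative links and same-host links are both internal
        if host == "" or host == self.domain:
            self.internal += 1

def count_internal_links(html, domain):
    parser = InternalLinkCounter(domain)
    parser.feed(html)
    return parser.internal

# Toy page: two internal links, one external
page = ('<a href="/samples.php">Samples</a> '
        '<a href="http://www.example.com/about">About</a> '
        '<a href="http://other.com/">Out</a>')
print(count_internal_links(page, "www.example.com"))  # → 2
```

Run that over your page templates and you at least have hard numbers to show management when discussing which navigation blocks are worth consolidating.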
-
RE: 2013: Top 10 content SEO tips survey
These are all off the top of my head, but I know I have guideline notes somewhere in the mess of my desk that we usually follow. These will change on a regular basis as necessary to keep up with best practices and algorithm changes. The things posted below could all be completely different 3 months from now if Google decides to release a Zebra Algorithm targeting websites with puppies on them.
- Title tag: How many characters / words? Best keyword positions? Best tricks?
- Over 10 characters and under 70 characters, though there are instances where titles will be truncated due to character width. I usually aim for 56. Keep in mind that Google can choose to replace your title in the SERPs with what they feel is more relevant. We stick to the format of "[sitename] [separator] [relevant core terms, descriptor, or short sentence with terms near the front]", or we flip it.
- Meta description: How many characters / words? Best keyword positions? Best tricks?
- Over 70, under 160... closer to 156, and similar to the title it can be truncated due to overall character width. Core terms nearer to the front where applicable. Keep in mind that Google can choose to replace your description in the SERPs with what they feel is more relevant.
- Meta keywords: Use them? How many?
- Bing and Yahoo can still make use of meta keywords. Google does not (unless you're News and using the new News related keywords tag). We often continue to include them but more as an idea concerning the interrelation of pages on our sites.
- H1 tag: How many characters / words? Best keyword positions? Best tricks?
- Core term or most relevant terms concerning the page. If it includes a keyword we aim for closer to the beginning but that may not always be necessary. Make sure they make sense for the over-arching theme of the page.
- H2-H6 tags: How many characters? Best keyword positions? Best tricks?
- We keep them concise and to the point. Core terms near the beginning where applicable.
- Image alternative text: How many characters / words? Best tricks?
- Short, sweet, concise and relevant.
- Text length: Minimum, maximum? What’s better: 1 long article or split an article in several pages?
- I stick to one long article with relevant h2-h6 tags highlighting important parts. Three sentences in Word look like nothing; on your page they may look like a huge paragraph. Keep paragraphs concise, 3-4 sentences each. Breaking things up with an image is always nice.
- How many links within an article? Min? Max?
- As many as are necessary. You can get away with no links if you really want to, but a few can help point out other relevant copy either on your site or off. Too many links, though, will look cluttered and spammy. This is more a personal choice, but if it looks bad then it probably is bad. We had a basic rule of thumb of no more than 1 link per 100 words of copy... this was not strictly adhered to, and often there were fewer links.
- Usage of keywords within links?
- Natural-sounding links are best now. Stay away from heavily keyword-laden text links. One or two every now and then is fine, but overall you want them to be simple and natural. More people link to sites using the site name, site URL, or phrases like "Click here" than they do with terms like "Cheap Red Widgets".
- Your favorite SEO tip?
- Canonicals are your friend.
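The character bands above can be sanity-checked in bulk with a few lines of scripting. A minimal sketch using the 10/70 and 70/160 ranges from my notes (these are rules of thumb, not hard limits — actual SERP truncation is by pixel width, not character count):

```python
# Rough length bands from the guidelines above; real truncation is by pixel width
BOUNDS = {"title": (10, 70), "description": (70, 160)}

def length_check(kind, text):
    """Return 'short', 'ok', or 'long' for a title or meta description."""
    lo, hi = BOUNDS[kind]
    n = len(text)
    if n < lo:
        return "short"
    return "ok" if n <= hi else "long"

print(length_check("title", "Faux Panels | Realistic Stone & Brick Panels"))  # → ok
```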
-
RE: Can I 301 Re-Direct within the same site?
Same products, same copy, etc.?
If you want Page A to no longer exist and Page B to replace it in the SERPs then use the 301 to send people to Page B and pass link equity/seo juice.
If you want Page A to continue existing but you want Page B to potentially replace it in SERPs, use rel="canonical" so that Page A may still be found and visited but pass equity to Page B which will/should eventually replace Page A in the SERPs for the terms where Google feels your canonical suggestion is completely relevant.
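For reference, the canonical route is a single tag in Page A's head (the URL below is a placeholder):

```html
<!-- On Page A: suggest Page B as the canonical version; Page A stays visitable -->
<link rel="canonical" href="http://www.example.com/page-b">
```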
-
RE: Bad use of the Rel="canonical" tag
I would not consider using Canonicals as a means to optimize your rankings in the SERPs. Remember that rel="canonical" is a suggestion, not a directive (http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394). Google can choose whether they feel your canonical is relevant to use or if it should be ignored. So adding that canonical from your category page to your home page when they are not similar enough and especially if there are no duplication errors will probably lead to Google choosing not to use the canonical suggestion.
-
RE: What can i do about link threaths like these? What does Google do abt these?
True true. Didn't mean to make it sound like I may have been singling you out. I personally wouldn't pay anything for removal but I can see how in some cases it would help things along... especially when your site is your livelihood and you need to get those links down as quickly and painlessly as possible.
-
RE: What can i do about link threaths like these? What does Google do abt these?
Looks like you have more problems than just that one guy's directories. As for what Google does about this: nothing. If you paid for links and got hit by an algorithmic penalty, then Google has already done what they're going to do... i.e. devalue you until you fix everything wrong with your site and wait for a refresh to pick up that you've fixed everything.
As for getting the links removed from this specific person's sites, if there really is a legal case there then you can always try it but honestly I'd say that wouldn't be the smartest move. If there's content which falls under the guidelines to request a DMCA takedown then there is that route. Lastly, there is the Disavow Tool but I have not had a need to use it and don't know if that is really the best course of action for you.
Like Irving said, $20 per Link is a complete scam... but offering $30 to remove all of them just opens the doors for all the other link farms out there to attempt extorting people further.
-
RE: URL errors in Google Webmaster Tool
An explanation of the priority column from http://googlewebmastercentral.blogspot.com/2012/03/crawl-errors-next-generation.html:
- We’ve ranked the errors so that those at the top of the priority list will be ones where there’s something you can do, whether that’s fixing broken links on your own site, fixing bugs in your server software, updating your Sitemaps to prune dead URLs, or adding a 301 redirect to get users to the “real” page. We determine this based on a multitude of factors, including whether or not you included the URL in a Sitemap, how many places it’s linked from (and if any of those are also on your site), and whether the URL has gotten any traffic recently from search.
-
RE: Has any on else experienced a spike in crawl errors?
This leads me to a problem then. As per Dave (the author of the article), "using canonical tags will result in duplicate errors being suppressed. If one page refers to another as a duplicate, then that pair will not be reported as duplicates. Also, if two pages both refer to the same third page as their canonical, then they will not be reported as duplicates of each other, either."
But now that this change has gone into effect I have 2000+ more duplicate content errors appearing and they are all pages with rel="canonical" pointing to the original page. So, as he stated earlier in the post this has caused "the most negative customer experience we anticipate: having a behind-the-scenes change of our duplicate detection heuristic causing a sudden rash of incorrect "duplicate page" errors to appear for no apparent good reason."
Is this something that will eventually correct itself or is this something that will need tweaking of the new detection method?
-
RE: Tag archives in wordpress
I wouldn't add a canonical from a Tag archive to a post (especially if there are multiple posts in the Tag archive).
The SEO value of Tags (and Categories) comes from them creating a hierarchy in your site as well as creating relevancy signals between all of the posts that appear in that Tag archive. If you have 596 tags and there are tons that only have 1 post in them then those one post Tags aren't helping you. You may need to consider cleaning up your tags, checking traffic and ranking for your tags, re-tagging posts to the most relevant tag with good traffic and/or rankings, deleting the useless non-relevant tags, and placing 301 redirects from the removed Tags to relevant tags.
We're currently going through the same steps on one of the sites I work for but in our case there are only 274 tags to deal with.
-
RE: Do Dashes in Domain names hurt SEO ranking?
Spammy domains have been known to overuse the hyphen... but using hyphens does not make you spammy.
Matt Cutts had previously stated that Google recognizes the hyphen as a separator and the underscore as a connector... i.e. "red-widgets" gets read as "red widgets" while "red_widgets" gets read as "redwidgets". For keyword purposes, a hyphen is technically better, but the difference is likely negligible. Also keep in mind the EMD update. If your core term is "cheap red widgets" and your domain URL features "cheap-red-widgets", then the EMD update has made that previously positive name correlation a less powerful signal.
Matt Cutts 2011 Underscore vs Dashes in URLs video http://youtu.be/AQcSFsQyct8
Matt Cutts 2009 Underscores or Hyphens in URLs video http://youtu.be/Q3SFVfDIS5k
Matt Cutts 2005 Dashes vs. Underscores blog post http://www.mattcutts.com/blog/dashes-vs-underscores/
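The separator-vs-connector reading Matt describes can be mimicked in a couple of lines. This just imitates the stated behavior for illustration; it is not Google's actual tokenizer:

```python
def read_slug(slug):
    """Hyphens read as word separators; underscores read as connectors (dropped)."""
    return slug.replace("-", " ").replace("_", "")

print(read_slug("red-widgets"))   # → red widgets
print(read_slug("red_widgets"))  # → redwidgets
```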
-
RE: Has any on else experienced a spike in crawl errors?
I saw a huge spike after the last crawl. In my case, the canonicals we set on our site months ago to handle some duplicate content issues appear not to be seen by SEOMoz's crawl. Though when I check for duplicate title & meta issues in Webmaster Tools, I don't see the offending pages that SEOMoz is showing me. That leads me to believe something is happening with either how the SEOMoz system is reporting or how their bot is crawling.
-
RE: A good META title for a front page....
Here are title tag best practices straight from SEOMoz http://www.seomoz.org/learn-seo/title-tag that may help.
-
RE: Page Speed & SERPS
You might also want to check out https://developers.google.com/speed/pagespeed/ to see if there are any suggestions Google can give you to make your site faster.
-
RE: Keyword density and it's impact?
Personally, I would say I see the best returns from a mixture of #2 & 3.
Produce your page with the keyword(s) in mind. Make sure your important relevant terms appear in places such as the title, body, and/or the H1 (h2, h3...) as is necessary and natural. Use your keyword(s) in the copy where it makes sense without repeating them stupidly (I've found Thesaurus.com will keep me from using the same word 14 times because I can't think of a better way to say it that day). Keyword density percentages are not something we should be worrying so much about, because there is no magic density percentage that the algorithms see and pat you on the back for achieving. Bold/italicize words as is necessary and reasonable on the page for emphasis, but if you're just highlighting a single term/phrase repeatedly because you want Google to notice it, then I think you're asking them to ignore it because you're trying too hard (and possibly missing out on other really great avenues because you've become too concerned over one thing).
Honestly, I don't think Google cares that you've decided what your important core term is, used it at the "perfect" X% density, and then highlighted it repeatedly so they can see it better... Google will rank you where it sees fit and for what it sees fit. We do our best to help them understand what our pages are about so that they get indexed and appear in the SERPs, but more often it feels that diversifying leads to better returns organically than hyper-targeting.