Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Block all search results (dynamic) in robots.txt?
You can block the URLs that contain the term "/product/search/". It can easily be done by adding the following to your robots.txt:
User-agent: *
Disallow: /product/search/
Hope this helps...
| onwebtoday0 -
Should I let Google crawl my production server if the site is still under development?
Thank you for the detailed response, Paul. I'll get cracking on your suggestions. I was mostly worried that if I blocked the site now, Google would be mad at me later. You've given me a way to deal with the bot concerns. I am less concerned that anyone will find these pages. I only knew about their index status because of one of my monitoring services, which alerted me that Google was crawling.
| DoItHappy0 -
Philosophy & Deep Thoughts On Tag/Category URLs
Hey Mike, great question(s)!

1. Are indexed tag/category pages really a duplicate content problem, and if so, why the heck?

Since we are getting philosophical - let's define "what is duplicate content" in the first place. There are really two different types:

- Technical duplicate content - this is the kind we're referring to here. It's not real duplicate content (like trying to copy the same article over and over; it's not even cross-domain). Technical duplicate content exists as a result of a function of the CMS or web development - like tracking parameters, non-canonical homepages (www, non-www, /index.html all loading, etc.), or sorting functions on ecommerce sites.
- Actual duplicate content - this is more like when someone has scraped an article from one domain to another, or copied an article on purpose to try to pass it off as "unique" when it's totally copied.

Tags & categories sort of cause "technical duplicate content," but not always. It depends how you have WordPress set up. Most commonly, I see them create duplicate content in the sense that a tag archive might look almost exactly the same as the article page itself - or very similar. OR, what a lot of people are referring to and don't even realize it (which is a bit of a pet peeve) is the subpages off of tags and categories. When tag and/or category pages paginate (again, depending on how it's set up), the title tags will look like duplicates, i.e.:

/tag/exercise-and-nutrition/ has the title tag: Exercise and Nutrition - Healthblog.com
/tag/exercise-and-nutrition/page/2 etc. still has the title tag: Exercise and Nutrition - Healthblog.com

So the question really is - if tags/categories are "technical duplicate content," is THAT type of "duplicate content" an issue? I've heard Google say: NO. John Mueller from Google has said multiple times in Webmaster Central Hangout Help Videos - "Google can distinguish this sort of accidental duplicate from real duplicate content". BUT - not so fast - tags and categories can still be an issue, just NOT because of "duplicate content." It really all depends how you have them set up:

1. I first recommend understanding the distinctions between tags and categories (image from my WordPress article).
2. I do recommend indexation of categories by default in most cases (not sure where you've heard to noindex categories) - IF they are used correctly per #1 above. If you use 5-8 well-constructed and well-chosen categories, there should not be a problem with indexing categories.
3. Noindex subpages of archives - this kills 95% of what some folks mistakenly call "duplicate content," which is really just duplicate title tags from the pagination of subpages (there's a sketch of this at the end of this answer).
4. I highly advocate leaving some tags indexed (using the Yoast SEO plugin) that are bringing traffic - here's how I do that analysis when de-indexing tags.

Here are the REAL issues that tags and subpages CAN create:

- Index bloat - lots of pages getting indexed that fill up the index and distract from what you might prefer to rank for instead.
- Poor user metrics from Google results - users tend to bounce off of tag archives, creating lower user metrics, which can feed back into rankings.
- Dilution of content - so while this isn't "duplicate content," it is content dilution: multiple pages that all sort of overlap in topics.

2. Is there a strategy for ranking tag/category pages for news publishing sites ahead of article pages?

Totally!
Check out Kane's comment on my WordPress post - essentially he is saying to customize your category archives with some unique content on them, so as to distinguish them from the posts themselves. Also, only display excerpts of your posts on archive pages. We always cite SugarRae's blog as a great example - check out her category page here. It has totally unique content at the top, and the posts below.

To conclude, and to keep it philosophical: I think what you're also getting at here is an important part of SEO (or anything) that people don't talk about as much - the idea of keeping an open mind, analyzing your specific situation, testing the limits of "rules," and really applying your own brain. Validate things for yourself. One of the biggest issues is that most people do not use tags in a deliberate way or really understand how they fully function. They just slap 20 tags on every post (which they think is a magic SEO trick) and end up with thousands of tag pages (I've seen sites with 7,000+ tag archives!). At the beginning this might not be an issue, but over time, if done recklessly like that, it can cause some of the problems noted above. Great question! -Dan
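P.S. A minimal sketch of what point #3 above looks like in practice - with "noindex subpages of archives" enabled (e.g. in the Yoast SEO plugin), a paginated tag URL typically ends up with something like this in its head (the URL and site name are just the examples from earlier in this answer):

```html
<!-- Hypothetical output on /tag/exercise-and-nutrition/page/2/ only;
     page 1 of the archive stays indexable -->
<title>Exercise and Nutrition - Healthblog.com</title>
<meta name="robots" content="noindex, follow">
```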
| evolvingSEO0 -
Optimizing for Venice.
Hi Greg, In order to qualify for inclusion in the local results, your client must:

- Have a physical street address in the target city (not a P.O. box, virtual office or shared address)
- Have a local area code phone number matching the target city (not a toll-free number, call tracking number or shared number)
- Have face-to-face transactions with customers, either at the place of business or at the customers' homes or businesses.

If your client can answer yes to all 3 of these criteria, then you can create a strategy including some or all of the following components:

- A high quality website with excellent content optimized for both his services and his geo terms. Optimize title tags, meta description tags, header tags, alt tags, internal links and copy so that they reflect both what he does and where he does it.
- Build a unique page on the website for each of his distinct services (water heater repair, sewer repair, septic service, etc.).
- If the business model is go-to-client (like a plumber), build a city landing page for each of the main cities he serves (see: http://www.solaswebdesign.net/wordpress/?p=1403). Be sure that every page you publish is unique. No cutting and pasting from one page to the next.
- Put the business' complete NAP (name, address, phone number) in the footer of the website and on the Contact page, preferably encoded in Schema (see the sketch below).
- Consider having an onsite blog to continue publishing local-focused content over time.
- Create a Google Places for Business page, adhering strictly to the Google Places Quality Guidelines (see: http://support.google.com/places/bin/answer.py?hl=en&answer=107528)
- Begin building citations on a variety of indexes and directories to strengthen and diversify the number of places in which your client is being profiled. Be sure NAP is totally consistent everywhere it is published.
- Work to earn reviews from happy customers.
- Consider linkbuilding and social media efforts that will build authority.

This is just a brief summary. Every step has its nuances and there is more you can do, but this should get the client started with building a Local presence on the web.
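Regarding the Schema-encoded NAP, here's a minimal sketch using schema.org microdata - the business type, name, address and phone number are all placeholders to swap for the client's real details:

```html
<!-- Hypothetical footer/Contact-page NAP markup; every detail below is a placeholder -->
<div itemscope itemtype="http://schema.org/Plumber">
  <span itemprop="name">Acme Plumbing &amp; Septic</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main Street</span>,
    <span itemprop="addressLocality">Springfield</span>,
    <span itemprop="addressRegion">IL</span>
    <span itemprop="postalCode">62701</span>
  </div>
  <span itemprop="telephone">(217) 555-0134</span>
</div>
```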
| MiriamEllis0 -
Can't seem to get traffic back post Panda / Penguin. WHY?
We didn't get an instant hit. It was a 50-60% drop spread from August to the end of February in a pretty steady downward slope, and it has been pretty level from February until now. That's why I'm a bit unsure of the reason for the drop. I have already cross-referenced the Google updates and there is a rough correlation between the updates and the start of the decline, but nothing concrete.
| mark_baird0 -
Nofollow in site architecture. Good or bad in 2013?
That does answer your question, but you still have the issue of so many links on every page. In my experience you don't need to stick to the "guideline" of 100 links per page, especially on an eCommerce site with multiple sub-categories all linked to from the navigation. However, there are many ways around this. For example, you can link to main category pages and sub-category pages from the top nav, and only show the further tertiary categories and drilldown / faceted links in the sidebar for that category if you are on one of the pages within that category (there's a rough markup sketch below). Make sense? This puts some of your product pages one click further away from the home page, but that is fine. I tend to cringe when I see totally FLAT architecture on an eCommerce site that big anyway. Use of breadcrumbs, related product links, footer links, sitemaps and good top-level and sidebar navigation will ensure your entire site gets crawled easily and PageRank distributed properly without having thousands of links in the header navigation. Good luck!
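For what it's worth, a rough sketch of that idea in markup - all category names and URLs here are invented:

```html
<!-- Hypothetical top nav: only main categories and sub-categories -->
<nav class="top-nav">
  <a href="/shoes/">Shoes</a>
  <a href="/shoes/running/">Running Shoes</a>
  <a href="/shoes/hiking/">Hiking Shoes</a>
</nav>

<!-- Hypothetical sidebar rendered only on pages within the Running Shoes
     category, exposing the deeper tertiary / faceted links for that branch -->
<aside class="category-sidebar">
  <a href="/shoes/running/trail/">Trail Running</a>
  <a href="/shoes/running/road/">Road Running</a>
  <a href="/shoes/running/?brand=acme">Brand: Acme</a>
</aside>
```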
| Everett0 -
Development site crawled
Unfortunately, robots.txt won't prevent your site from being indexed if there is a link from an external site pointing to yours. What you need to do is use a noindex meta robots tag on all your development pages (see the example below). I don't know how big your site is, so this may or may not be a lot of work. Do this, then after the next Google crawl, your pages will be dropped from the SERPs.
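As a minimal sketch, the tag would sit in the head of each development page (exactly where you add it depends on your CMS/templates):

```html
<!-- In the <head> of every development page; the pages must stay crawlable
     so Google can see the tag and drop them from the index -->
<meta name="robots" content="noindex">
```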
| ollan0 -
When to write long content on a general informational keyword: Ecommerce
"Then create a main 'hub' page (the 'Complete Guide') that links out to each of these articles, as well as to your actual product page." This is a very good idea. It is actually like the category pages on an information site. These can be very competitive because they are optimized for the difficult term, they often have lots of content, and they link to much deeper content elsewhere on the same site - and those pages link back.
| EGOL0 -
Dealing with Spammy Affiliate Site Copies
I would do both #1 and #2. If you can contact them and ask them to take the content and links down I would do that as well, documenting the request for future use in the DMCA complaint or in a disavow link request in GWT - if needed.
| Everett0 -
More authority backlinks but lower MozRank than competitor
The result count at the top is always the total count, regardless of how you filter. I'd look at only external links that are 301ed or followed, and that should be a much more accurate number.
| Dr-Pete0 -
Homepage 302 Redirect For SSL
Hi Mike, We used the SEOmoz crawl test tool that provided the results. The stats below are for http://domain.com and https://domain.com (we have a 302 from the http:// to the https://). Results are exactly the same.

| Internal Links | Linking Root Domains | External Links | Page Authority |
|---|---|---|---|
| 216 | 58 | 259 | 45 |
| 216 | 58 | 259 | 45 |
| Rogs.SEO0 -
Can I, in Google's good graces, check for Googlebot to turn on/off tracking parameters in URLs?
No problem Ashley! It sounds like that would fall under cloaking, albeit pretty benign as far as cloaking goes. There's some more info here. The Matt Cutts video on that page has a lot of good information. Apparently any cloaking is against Google's guidelines. I would suspect you could get away with it, but I'd be worried every day about a Google penalty getting handed down.
| john4math0 -
Page not appearing in SERPs
Thanks. I'm calling Google broken - or at least my version of it! Not only can I not see my site at all, there are 2 pages from the same cheap flights website, and another holiday company I've never heard of (it's my job to hear of them!). Very strange...
| Cornwall0 -
What is the best way to optimize/setup a teaser "coming soon" page for a new product launch?
Thanks for our laugh for this morning Mike. We had no idea we were ranking for that!
| KeriMorgret0 -
How to use rel="canonical" for our Website
Yeah, I'm with Mike - these are prone to cause you some real trouble. Given how many there probably are and how often they change/rotate, I'd strongly suggest using rel=canonical (see the sketch below) or somehow not indexing the alternate offers. They may be necessary for users, but these pages aren't all necessary to have in your index. By trying to rank for every single one, you risk harming your more important rankings. Honestly, as Mike said, Google can't even really tell these are different, except for the URLs, so even the long-tail ranking benefits are nearly zero, I suspect.
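A quick sketch of the rel=canonical option - the URLs are made up, but the idea is that each rotating/alternate offer page points at the one version you actually want ranking:

```html
<!-- In the <head> of a rotating offer page, e.g.
     http://www.example.com/offers/spring-sale-variant-b/ (hypothetical URL) -->
<link rel="canonical" href="http://www.example.com/offers/spring-sale/">
```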
| Dr-Pete0 -
How to Fix Duplicate Page Content?
Just wanted to add a note that our tools do not detect duplicates across domains or on other websites, so these warnings are completely tied to your own pages/URLs. These are "near" duplicates in our view, and Takeshi is right - there are many possible solutions. I'm guessing you can't directly combine them, from an e-commerce standpoint, but I would suggest either making a "parent" page and using rel=canonical, or just making sure there's navigation between the formats/versions and then pointing rel=canonical to the most common version (i.e., the one your customers actually buy). Technically, this will remove one version from ranking consideration, but I think that's preferable to having 100s or 1000s of versions out there and diluting your ranking ability or even having Panda-related problems. It's one thing if you have Amazon's link profile, but the rest of us aren't so lucky.
| Dr-Pete0 -
Old links showing on new domain
Thank you for your comments. Did you find that rankings improved once the old links disappeared?
| jj34340 -
How Do I Syndicate Content for SEO Benefit?
Hi Matt, Thank you very much for the two valuable suggestions. I agree with you. We have added the product specifications in paragraph form. Now we are planning to compile a set of content for each and every product page for a good user experience - it may add some romance to the product pages. I will consider your input about URL length in the next product import. If you have any additional inputs or suggestions for me, that would be fabulous, because I have visited your website and you have done great work over there. This is the biggest reason to love SEOmoz... the right platform to communicate more with e-commerce marketing guys just like you!!
| CommercePundit0