Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Link Juice flow control
No, nav and footer links do not suck juice that you could otherwise redirect somewhere else via 'nofollow'. Yes, juice is split evenly between all outgoing links on a page, so if you reduce the number of links (by deleting them, not just making them 'nofollow'), you can increase the juice passed to the remaining links on the page. This is why it's rarely a good idea to have a page with tons of links on it; each one will only pass a little bit of juice. Having said that, nav and footer links can be very helpful for the humans who visit your site, so you don't want to strip them all away and make your site hard to use. But if you have dozens of barely-useful links jammed into your footer, you should probably nuke those. Design your site for humans, and trust that Google will do something reasonable with the link juice flowing around your site.
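As a rough illustration of the even-split idea (a simplified PageRank-style model, not Google's actual algorithm; the damping factor and function are illustrative assumptions):

```python
# Simplified link-equity model: a page passes a damped share of its
# score, split evenly across its outgoing followed links.
# Illustrative only -- Google's real algorithm is far more complex.

DAMPING = 0.85  # classic PageRank damping factor (assumption)

def juice_per_link(page_score: float, num_links: int) -> float:
    """Equity each outgoing link receives under an even split."""
    if num_links == 0:
        return 0.0
    return page_score * DAMPING / num_links

# Fewer links means more equity per remaining link:
print(juice_per_link(1.0, 10))   # ten links on the page
print(juice_per_link(1.0, 100))  # a hundred links on the page
```

Under this toy model, trimming a 100-link footer down to 10 links makes each surviving link pass ten times as much equity.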
| scanlin0 -
Does Google take the number of ad tracking pixels on a page into consideration in its ranking algo?
I've heard people ask that question many times, and the answer has always been (from many people) "No, it's a non-issue." So I don't think you have to worry about it. Here is one thread on the subject (there are others if you Google for it): http://www.webmasterworld.com/forum3/8461.htm
| scanlin0 -
I think I'm stuck in a 301 redirect loop
Wow, I've been working with Miva's Design Club, going back and forth on how to fix this, for over a month. I should have asked the question here. I can finally get down to fixing this issue. Thank you! jr
| Technical_Contact0 -
Duplicate content conundrum
Jaime, I'm a firm believer that there is no duplicate content problem except on-site. I would simply post the article on your site/blog and give a citation/link to the original source. I wouldn't bother, but you could just post the first half and link on for the rest. This is, after all, what Google News does.
| TellThemEverything0 -
Best way to Handle Pagination?
No, I meant use /blogs/ as the first page and /blogs/pX for the next pages, X being the page number. These pages are valid and are not 301s, of course. BUT /blogs/p1 is the same as /blogs/, so you should 301 it. AND you must be aware of nonexistent pages in the pagination (p10000, because you don't have 10,000 pages of paginated results; /blogs/p01 or /blogs/p02, because those pages should not exist).
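Those rules can be sketched as a small routing helper (a minimal sketch, not tied to any framework; the `total_pages` parameter and the return conventions are assumptions for illustration):

```python
# Sketch of the pagination rules above: /blogs/ is page 1, /blogs/pX the
# rest; /blogs/p1 duplicates /blogs/ so it 301s; out-of-range or
# zero-padded page numbers (p01, p10000) should not resolve at all.

import re

def resolve_blog_page(path: str, total_pages: int):
    """Return ('ok', page), ('301', target) or ('404', None)."""
    if path == "/blogs/":
        return ("ok", 1)
    m = re.fullmatch(r"/blogs/p(\d+)", path)
    if not m:
        return ("404", None)
    raw = m.group(1)
    if raw != str(int(raw)):      # reject zero-padded forms like p01
        return ("404", None)
    page = int(raw)
    if page == 1:                 # duplicate of /blogs/ -> redirect
        return ("301", "/blogs/")
    if page < 1 or page > total_pages:
        return ("404", None)      # e.g. p10000 when far fewer exist
    return ("ok", page)
```

Serving a hard 404 for the padded and out-of-range variants keeps crawlers from indexing an unbounded set of duplicate pagination URLs.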
| baptisteplace0 -
Ranked on second page for keyword; now not within first 1000 listings?
Ok, so it seems that we have articles on our blog that have been syndicated through article submission sites, such as ArticleBase and the like. From what I've read, these sites were hit pretty hard by the content farmer update, which is more than likely why we were hit so hard. I'm in the process of having unique content written and posted to the site, and also removing any and all duplicate content. Hopefully, this will put us back in our previous ranking position.
| Tony19860 -
Duplicate Content
Thank you for you response, I have heard mixed responses, but generally other people agree with you. I will work on improving the on site country specific optimization to improve conversions!
| -Al-0 -
Top Level Domains
Thanks guys, great insights!

@Nick I do have multiple ccTLDs for the same site. The content for each, however, will be significantly different. By 'domain mapping' I meant actually getting into the DNS records and mapping the ccTLD URL to a sub-domain.

Rel canonical redirect: I'm assuming that the .co.uk would be the canonical page? If this page is the canonical page, and the .com/uk/ is the 'discounted' page, what happens if the rest of the site uses the .com/uk convention? (In other words, is it advisable to have this inconsistency, both from a usability and an indexing point of view?)

@Gary I think this is a very interesting point. I agree with both of you that if I saw a billboard for domain.com/uk, I might find it slightly odd. However, I'm not sure whether consistency trumps familiarity or not.

Further down the rabbit hole: I will have multiple languages (let's say en, fr, es). I want this to utilise sub-directories (I want to avoid super-fancy AJAX whatnot. I HATE Google's help page URLs, for instance).

domain.com/us/en/
domain.com/us/es/

The idea here is that the site ranks for multiple languages within a country (without creating super-duper long URLs). Any ideas/tips? Maybe a quick outline might help:

1 - Main (sort of a splash/navigation page)
    1 - USA
        1 - EN
        2 - ES
    2 - UK
        1 - EN
    3 - France
        1 - FR
        2 - EN
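The country/language sub-directory scheme in that outline could be generated like this (a sketch; the country codes, the `SITE_STRUCTURE` map, and the helper function are hypothetical names mirroring the poster's example structure):

```python
# Build /<country>/<language>/ paths from the outline above.
# The country -> languages map mirrors the example structure
# (USA: en, es; UK: en; France: fr, en) -- assumptions for illustration.

SITE_STRUCTURE = {
    "us": ["en", "es"],
    "uk": ["en"],
    "fr": ["fr", "en"],
}

def locale_paths(structure: dict) -> list:
    """Flatten the structure into sub-directory URL paths."""
    return [
        f"/{country}/{lang}/"
        for country, langs in structure.items()
        for lang in langs
    ]

for path in locale_paths(SITE_STRUCTURE):
    print(path)  # /us/en/, /us/es/, /uk/en/, /fr/fr/, /fr/en/
```

Keeping the structure in one map like this makes it easy to enforce the consistent `/<country>/<language>/` convention across every section of the site.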
| RADMKT-SEO0 -
Implications of hosting country versus actual trading country?
Hi Andy - I wish I could move away from them, but seem to be locked in. Yes, there are databases involved. Thank you. I am now looking into hosting in the UK. You live and learn - but sometimes at a price. Thanks again.
| PH2921 -
What has to be changed to improve rank?
Add more unique content. The page may have been dropped after the last Google farm update because it was not unique enough. As the others have pointed out, just build more links; obvious but true.
| moldybacon0 -
Some sites like bbc.co.uk place the most important category links at the bottom of the page while other sites will place the whole site map there. What are the benefits (or not) of both approaches?
If I were to put a site map at the bottom of a page, I would not list every single page but only the most important category and page links, along with the links required for legal purposes. You don't want footer links to drain your page of link juice, nor overwhelm your page design. What's more, placing a ton of the same links on every single page of a site is definitely considered spammy these days. For news sites, yes, the lead story should be the first thing a user sees. For most other sites, though, a few links in a well-designed top nav bar should be placed near the top of the page or directly underneath a branded banner.
| Christy-Correll0 -
What is considered best practice today for blocking admin pages from potentially getting indexed?
Agreed with the above two answers. Use an obscure URL and use meta tags to noindex/nofollow the pages. I wouldn't worry too much about people finding your admin pages. You should already have security measures in place that prevent people from hacking your site or "guessing" your admin credentials. If you don't have these types of measures in place, then I would recommend concentrating on them first. Some ideas of things to look at:

- Ensure pages do not allow SQL injection attacks
- Use complex usernames and passwords
- Stop people from entering the wrong username and password more than x times within y minutes (e.g. lock out the account either permanently or for a temporary period)
- If someone repeatedly enters incorrect credentials within a given period of time, prompt them with a captcha check to ensure no bots are trying to access the site
- Ensure passwords are changed regularly
- Set up an alerting system should incorrect credentials be entered

Plus there are LOADS more things you should do.
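The "x failures within y minutes" lockout rule above can be sketched in a few lines (a minimal sketch; the thresholds, function names, and in-memory storage are illustrative assumptions, not a production implementation):

```python
# Sketch of the lockout rule: block an account after MAX_ATTEMPTS
# failed logins within WINDOW seconds. A real system would persist
# this state and likely also rate-limit by IP.

import time
from collections import defaultdict, deque

MAX_ATTEMPTS = 5   # assumption: x = 5 attempts
WINDOW = 600       # assumption: y = 10 minutes, in seconds

_failures = defaultdict(deque)  # username -> timestamps of recent failures

def record_failure(username: str, now: float = None) -> None:
    """Log one failed login attempt for this account."""
    now = time.time() if now is None else now
    _failures[username].append(now)

def is_locked_out(username: str, now: float = None) -> bool:
    """True once the account has too many recent failures."""
    now = time.time() if now is None else now
    recent = _failures[username]
    while recent and now - recent[0] > WINDOW:
        recent.popleft()  # drop failures that aged out of the window
    return len(recent) >= MAX_ATTEMPTS
```

Checking `is_locked_out()` before even validating the password also blunts brute-force and credential-stuffing attempts, and pairs naturally with the captcha check mentioned above.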
| perfectweb0