Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Can URL rewrites fix the problem of critical content too deep in a site's structure?
Your second URL got cut off, so I can't see the exact length. You'll definitely want to keep the URL shorter, but when I think of site structure in terms of indexation, I'm thinking more about the number of clicks from the homepage than about URL structure (not to say the URL isn't important). Matt Cutts has indicated that Google places less priority on keywords toward the end of a long URL (source), so keep that in mind when considering the value that will be placed on the page name itself (which will often be more important than the subfolder keywords used). I'd personally change them for user experience and because it looks cleaner and less spammy.
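If you do decide to flatten the URLs, a 301 rewrite is the usual mechanism so existing links keep their value. A minimal Apache mod_rewrite sketch, with hypothetical paths used purely for illustration:

```apache
# .htaccess — the paths here are made-up examples
RewriteEngine On
# 301 the deep URL to the flattened version so inbound links follow along
RewriteRule ^products/widgets/blue/acme-widget$ /acme-widget [R=301,L]
```

The `R=301` flag issues a permanent redirect and `L` stops further rule processing; you'd need one rule (or a pattern-based rule) per URL structure you flatten.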
| KaneJamison0 -
What can I do to let Google know we are a lifestyle magazine?
On the picture of Demi Moore, you could have: Demi Moore, picture with article on breakup with Ashton Kutcher. Is she seeing someone else? That is fine. Just don't put a bunch of keywords in there. Best
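In markup, that advice is just a descriptive alt attribute rather than a keyword list. A sketch, with a hypothetical file name:

```html
<!-- Descriptive, not keyword-stuffed; file name is hypothetical -->
<img src="demi-moore-article.jpg"
     alt="Demi Moore, picture with article on breakup with Ashton Kutcher" />
```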
| RobertFisher0 -
Quick SEO question regarding a 301 redirect
There is no guarantee that a 301 will pass link juice. If you redirect one website to another, there is a good chance it will not pass link juice unless it is the same content, meaning you are switching domains. Simply 301 redirecting everything to the home page will also see the links dismissed. There is a Matt Cutts video that points out you cannot take it for granted that a 301 will pass link juice; there has to be an honest, legitimate reason to redirect. But I have never heard of it hurting.
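A domain switch done page-to-page, which is the case most likely to carry link equity, can be sketched in Apache config like this (domain names are placeholders):

```apache
# .htaccess on the old domain — newdomain.example is a placeholder
RewriteEngine On
# Each old URL 301s to its exact equivalent on the new domain,
# rather than dumping every URL onto the new homepage
RewriteRule ^(.*)$ https://www.newdomain.example/$1 [R=301,L]
```

The blanket everything-to-the-homepage redirect is the pattern to avoid; it is exactly the situation where the links are most likely to be dismissed.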
| AlanMosley0 -
Does modifying URLs cause broken links?
Dan is correct: you need a 301 to correct the problem. It seems like you are trying to create friendly URLs. Depending on what server you are using, this can be very easy. If you are using a Microsoft server, you can create the friendly URLs along with the 301 redirects simply with a wizard. See halfway down the page: http://www.seomoz.org/ugc/microsoft-technologies-and-seo-web-development
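On IIS, the wizard ultimately writes rules for the URL Rewrite module into web.config. A hand-written sketch of the same idea, with a hypothetical query-string URL being permanently redirected to a friendly one:

```xml
<!-- web.config fragment (IIS URL Rewrite module); rule name and URLs are hypothetical -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Friendly product URL" stopProcessing="true">
        <match url="^product\.aspx$" />
        <conditions>
          <add input="{QUERY_STRING}" pattern="^id=([0-9]+)$" />
        </conditions>
        <!-- redirectType="Permanent" makes this a 301 -->
        <action type="Redirect" url="products/{C:1}"
                redirectType="Permanent" appendQueryString="false" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```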
| AlanMosley0 -
Does this page crawl well?
But of course Google will index the page, and it has! As for whether it will find the page useful and therefore worthy, that is not up to Google but to your visitors. To make the page rank, you need to include links from other pages that will eventually land your visitors on this page, including call-to-action buttons, etc. I would also suggest adding more content to give the page some density as well. Hope this helped.
| lonniea0 -
Sharing the same content on every page
It depends on how much info is there and where it is on the page. Never try to hide anything, though, as this can be seen as a negative by Google; they will wonder why you are trying to hide it. You could always change the content to a link and have it open a pop-up window with the same content if you want to get rid of this, though. Cheers, Andy
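The link-to-pop-up approach is a one-liner: move the repeated block to its own page and open it on demand. A sketch, with hypothetical page name and window dimensions:

```html
<!-- /shared-info.html is a hypothetical page holding the repeated content -->
<a href="/shared-info.html"
   onclick="window.open(this.href, 'info', 'width=500,height=400'); return false;">
  Delivery and returns info
</a>
```

Keeping the plain `href` means the content stays reachable if JavaScript is off; the `return false` stops normal navigation when the pop-up opens.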
| Andy.Drinkwater0 -
Duplicate Homepage In Google
I'm not a big fan of doubling-up on canonicalization tactics, because there's no good way to tell what's working. The 301 is probably a tiny bit stronger (not much, in my experience), but the advantage of the canonical tag on the home-page is that one tag will sweep up any variants. If you 301 "index.html" to the root, and then someone comes along and uses the non-www version of your home-page or adds a tracking parameter ("index.html?track=1234"), etc., the 301 won't do anything - you'll have to create 301s for each situation. The canonical will prevent those problems, which are very common on home pages.
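The home-page canonical that sweeps up those variants is a single tag in the `<head>`; the domain below is a placeholder:

```html
<!-- Served on every home-page variant: /, /index.html, non-www, tracked URLs -->
<link rel="canonical" href="https://www.example.com/" />
```

Because each variant declares the same preferred URL, new parameterized or non-www copies are consolidated without adding a redirect for each one.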
| Dr-Pete0 -
Sitemaps for Google
Nice - thanks Kane. Cool Chrome tool too, thanks for the suggestion. I'm in GWT every morning to check things out since our site is fairly large - about 220,000 pages. The sitemap checker is a really cool new feature in GWT too!
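For a site of that size, a sitemap index file is the usual structure, since the protocol caps each sitemap at 50,000 URLs. A minimal sketch with hypothetical file names:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- ~220,000 pages needs several sitemaps; file names are hypothetical -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products-1.xml</loc>
    <lastmod>2012-03-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-products-2.xml</loc>
  </sitemap>
</sitemapindex>
```

You submit only the index file in Webmaster Tools, and each child sitemap's status is then reported individually.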
| Prospector-Plastics0 -
Page crawling is only seeing a portion of the pages. Any advice?
Ahh, I think RankSurge was referring to Google crawling, and you were referring to SEOmoz crawling? For SEOmoz, send an email to help@seomoz.org and let them know which account/campaign you're talking about, and they'll make sure Roger crawls the full site. Keri
| KeriMorgret0 -
Viral page not ranking on Google
Mind shooting over the Query and URL in question? I'd love to take a peek directly.
| KrisRoadruck0 -
Is Pinging important?
Hi Andrew. I just posted two new questions titled "What after I place a blog comment" and "Traditional link requests still the right way?". Please reply when you have time. Thanks, KS__
| KS__0 -
How do I check if my IP is blocked?
Hey - I am not sure if this is relevant: I just checked the Track Rankings for the other site (fortresslearning.com.au) and it came back with: We’re unable to retrieve your ranking. Is that somehow meaningful? Bryan
| FortressLearning0 -
Separate pages for similar keywords from an SEO standpoint
There's a company called FutureSimple that does a good job of this in my opinion. Take a look at their pages for these two keywords: "Online CRM Software" - http://bit.ly/xK3KZQ "Web Based CRM Software" - http://bit.ly/yZaAow That is a good way to handle two slightly different keywords that mean the same thing, but that both have enough search traffic and competition that you have to target each of them individually. You'll note that they use a very similar page structure, but the images are different, and all of the text is different. If you use that method and all text and images are completely custom, then yes, I would create separate pages with unique URLs for each of those fairly difficult keywords. For keywords that are more similar, such as "send money to China from the US" and "send money to China from the United States", then no, I would not create multiple pages. Use the method above for synonym terms with lots of traffic going to each keyword.
| KaneJamison0 -
Why is Google ignoring my sitelinks demotions?
It sounds like it's been a while since you demoted those pages, so this sounds more like a bug. You could go to the Google Webmaster Forum and submit a request there, or if it's urgent and the following solution is doable, you could try blocking those 3 pages via a noindex tag or via robots.txt and see if they get removed that way. Once the pages have been removed using that technique, you could remove the noindex tags or robots.txt disallows, unless you would prefer to keep it that way and you don't want those 3 pages to show up anywhere...period. I hope that helps.
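Both blocking options are one-liners. The noindex version goes in the `<head>` of each of the three pages, and the robots.txt version lists their paths (the paths below are hypothetical):

```html
<!-- Option 1: on each of the three demoted pages -->
<meta name="robots" content="noindex" />
```

For option 2, the robots.txt entries would look like `Disallow: /some-demoted-page/` under `User-agent: *`. Note that a robots.txt block stops crawling rather than indexing, so the noindex tag is the more direct way to get the pages dropped; you just have to leave them crawlable for Google to see the tag.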
| NakulGoyal0