It looks like you have a bit of a redirect loop here - that might be what's hurting the server a bit
Posts made by Mark_Ginsberg
-
RE: How to solve the meta : A description for this result is not available because this site's robots.txt. ?
-
RE: Why use noindex, follow vs rel next/prev
It seems like they noindexed that page because it may be part of an antiquated version of the site navigation/structure, or part of the CMS and not something they want to promote. I'm not sure how you got there, but when you reach the primary version of a category and click through to the next page, the items shown change via AJAX and the URL stays the same, apart from a parameter indicating that this is the second set of items being shown.
With the URL staying the same for their primary path of navigation, I don't think rel prev/next would be relevant. And these other pages, probably created by the CMS but not easily accessible, they've noindexed - that's my best guess.
-
RE: How to solve the meta : A description for this result is not available because this site's robots.txt. ?
You should remove the robots.txt block on the redirect - if the redirect is implemented properly, and it directs straight to the new page rather than through some sort of infinite loop, it should be fine for your server to handle.
The bots think those pages are real pages, which is why they're indexed - by blocking the redirect in robots.txt, you've prevented the bots from visiting it and seeing that it points to another page, so they never learn that the redirecting URL should be replaced by the real page. If things are implemented properly, they'll follow the 301 redirect and drop the redirecting URL from the search results.
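To make the "straight to the new page, not through an infinite loop" point concrete, here's a minimal sketch that walks a redirect chain and flags loops. The redirect map is simulated with hypothetical URLs - in practice you'd issue HEAD requests and read the Location header instead.

```python
# Sketch: follow a chain of 301 redirects and detect loops.
# The redirect map below is simulated (hypothetical URLs), standing in for
# what a crawler learns from Location response headers.

def follow_redirects(start_url, redirect_map, max_hops=10):
    """Return the final URL, or raise if the chain loops or is too long."""
    seen = {start_url}
    url = start_url
    while url in redirect_map:
        url = redirect_map[url]
        if url in seen:
            raise ValueError("redirect loop detected at " + url)
        if len(seen) >= max_hops:
            raise ValueError("too many redirects")
        seen.add(url)
    return url

redirects = {
    "http://example.com/old-page": "http://example.com/new-page",  # clean 301
    "http://example.com/a": "http://example.com/b",                # these two
    "http://example.com/b": "http://example.com/a",                # form a loop
}

print(follow_redirects("http://example.com/old-page", redirects))
# Following /a -> /b -> /a instead would raise "redirect loop detected".
```

Once the robots.txt block is lifted, a crawler doing essentially this walk will reach the final URL and can swap it in for the old one.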
If you provide a link to one of these redirecting URLs, I'm happy to take a look and see if I can help you.
-
RE: Huge google index with un-relevant pages
In terms of what you've written, blocking a page via robots.txt doesn't remove it from the index. It simply prevents the crawlers from reaching the page. So if you block a page via robots.txt, the page remains in the index, Google just can't go back to the page and see if anything has changed. So if you were to block the page via robots.txt, and add a noindex tag to the page, Google won't be able to see the page with the noindex tag to remove it from the index because it's blocked via robots.txt.
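To see that behavior from a crawler's point of view, here's a small sketch using Python's standard `urllib.robotparser`; the domain and paths are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Simulate a robots.txt that blocks an /old-content/ folder (hypothetical paths).
rules = """
User-agent: *
Disallow: /old-content/
""".strip().splitlines()

rp = RobotFileParser()
rp.parse(rules)

# A well-behaved crawler may not fetch blocked pages - so it can never see a
# noindex tag placed on them, which is exactly why robots.txt alone won't
# get them deindexed.
print(rp.can_fetch("*", "http://example.com/old-content/page.html"))  # False
print(rp.can_fetch("*", "http://example.com/current/page.html"))      # True
```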
If you moved all of your old content to a different folder and blocked that folder via robots.txt, Google won't remove those pages from the index. In order to remove them, you would have to go into Webmaster Tools and use the URL removal tool on that folder - if they see it's blocked via robots.txt, then and only then will they remove the content from the index. It has to be blocked via robots.txt first in order for the URL removal tool to remove the whole folder.
I'm not sure though if this would work for the future - if you removed a folder from the index, and then added more content that was indexed previously afterwards, I'm not sure what would happen to that new content moved to that folder. Either way, Google will have to come back and recrawl the page to see that it has moved to the new folder, and then remove it from the index. So either way, the content will only be removed once Google recrawls the old content.
So I still think a better way to remove the content from the index is to add the noindex tag to the old pages. To facilitate the search engines reaching these old pages, I'd make sure there is a way the engines can get to them - make sure there is a path they can take to reach them.
Another good idea I saw in a forum post here a while ago: create a sitemap containing all of these old pages that are indexed and that you want removed, and add the noindex tag to those pages. Using the Webmaster Tools sitemap interface, you'll then be able to monitor the progress of deindexation over time - Webmaster Tools reports how many pages from each sitemap are indexed, so by checking that count for the old-content sitemap(s) now and again later, you'll have a good indicator of how the deindexation is progressing.
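If it helps, here's a minimal sketch of generating that monitoring sitemap with Python's standard library; the URLs are hypothetical placeholders for your old pages (remember the noindex tag itself goes on the pages - the sitemap is only for tracking indexed counts):

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap string for a list of URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = u
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical old pages you want to watch drop out of the index:
old_pages = [
    "http://example.com/old-content/page-1.html",
    "http://example.com/old-content/page-2.html",
]
print(build_sitemap(old_pages))
```

Submit the resulting file in the Webmaster Tools sitemap interface and recheck the indexed count periodically.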
-
RE: Website is not indexed in Google, please help with suggestions
Just an aside - you're going to have indexation issues - you have both www and non-www versions live on the site, with no canonicals pointing to one version. You also have index.php as a live page linked to from the logo. I'd definitely recommend implementing canonical tags across the site.
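As a rough illustration of what that canonical tag does, here's a sketch that collapses www/non-www and index.php variations onto one preferred URL - the domain and the choice of www as the preferred host are hypothetical:

```python
from urllib.parse import urlsplit

PREFERRED_HOST = "www.example.com"  # hypothetical preferred version of the site

def canonical_tag(requested_url):
    """Emit a canonical tag pointing at the preferred host,
    folding /index.php back to the root."""
    parts = urlsplit(requested_url)
    path = parts.path
    if path.endswith("/index.php"):
        path = path[: -len("index.php")]  # /index.php -> /
    return '<link rel="canonical" href="http://%s%s">' % (PREFERRED_HOST, path or "/")

# Both the non-www homepage and the index.php variant declare the same target:
print(canonical_tag("http://example.com/index.php"))
# <link rel="canonical" href="http://www.example.com/">
```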
Mark
-
RE: Huge google index with un-relevant pages
there are a bunch of articles out there, but each case is different - here are a few:
http://www.searchenginejournal.com/the-holy-grail-of-panda-recovery-a-1-year-case-study/45683/
You can contact me via private message here on the forum and I can try to take a more in depth look at your site if you can give me some more detailed info.
-
RE: Huge google index with un-relevant pages
Before we talk about a Panda hit - are you sure you actually got hit by Panda?
-
RE: Huge google index with un-relevant pages
Exactly - I'd build a strategy more around promoting pages that will have long lasting value.
If you use the tag noindex, follow, the page will continue to spread link juice throughout the site - it's just that the individual page with the tag will not be included in the search results and will not be part of the index. In order for the tag to work, the engines first have to crawl the page and see the tag, so it doesn't happen instantaneously - if they only crawl these deeper pages once every few weeks, once a month, or even less often, it may take a while for these pages to be removed from the index.
-
RE: Huge google index with un-relevant pages
Hi Assaf,
(I'm not stalking you, I just think you've raised another interesting question)
In terms of index status/size, you don't want to create a massive index of empty/low value pages - this is food for Google's Panda algorithm, and will not be good for your site in the long run. It'll get a Panda smack if it hasn't already.
To remove these pages from the index, instead of doing hundreds of thousands of 301 redirects, which your server won't like either, I'd recommend adding the noindex meta tag to the pages.
I'd put a rule in your cms that after a certain point in time, you noindex those pages. Make sure you also have evergreen pages on your site that can serve as landing pages for the search engines and which won't need to be removed after a short period of time. These are the pages you'll want to focus your outreach and link building efforts on.
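A minimal sketch of that CMS rule, with a hypothetical 90-day cutoff and hypothetical dates - the exact threshold is whatever makes sense for your content:

```python
from datetime import date, timedelta

def noindex_meta_tag(published, today, max_age_days=90):
    """Return a noindex,follow meta tag once a page passes the age cutoff.

    'noindex, follow' drops the page from the index while still letting it
    pass link juice. The 90-day cutoff is an example value, not a rule.
    """
    if (today - published) > timedelta(days=max_age_days):
        return '<meta name="robots" content="noindex, follow">'
    return ""  # fresh pages stay indexable

print(noindex_meta_tag(date(2013, 1, 1), date(2013, 6, 1)))   # old page: tag emitted
print(noindex_meta_tag(date(2013, 5, 20), date(2013, 6, 1)))  # fresh page: no tag
```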
Mark
-
RE: Duplicates of Home page with strange parameters
That post was from a week ago - it's still relevant, for sure.
The variations should be removed - canonical is an indicator to the engines, it's not going to destroy your site if you canonical pages to themselves.
I use the Yoast Wordpress SEO plugin on all of my own/client sites, and he has the self referencing canonical there as well. I think it's just good practice to prevent the types of issues you have mentioned.
To find the links to your site with these parameters, use Open Site Explorer from Moz, and also check the links listed in Google Webmaster Tools - it's under the Traffic section in the navigation. They'll both let you download lists of external links pointing to your site, and then you can see which links are causing these parameters.
-
RE: Duplicates of Home page with strange parameters
Hi Assaf,
In your case, you should definitely add the rel canonical tag to your homepage. In general, I think it makes sense to use the rel canonical to prevent issues just like you've written about.
Parameter handling in webmaster tools is more to block Google from crawling those parameters or tell them how to treat the parameters you use on your site - for your issue, the canonical tag should do the trick. You can refer to Dr. Pete's article here on the Moz blog for this issue and similar canonical issues.
Bottom line - on your site, use the rel canonical tag.
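To show why the canonical tag does the trick here, this sketch collapses parameterized homepage variations onto one canonical URL - the domain and parameters are hypothetical:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Strip the query string and fragment so parameterized variations
    all resolve to one canonical URL (hypothetical example domain)."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

variants = [
    "http://example.com/?utm_source=newsletter",
    "http://example.com/?ref=partner123",
    "http://example.com/",
]
# All three variations collapse to the same canonical target, which is
# the URL the rel canonical tag on the homepage should point to:
for v in variants:
    print(canonical_url(v))  # http://example.com/
```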
Hope this helps,
Mark
-
RE: Is there any good tools for toxic link checking.
Link Research Tools breaks down backlinks as well and will classify links for you as toxic - see here - http://www.linkresearchtools.com/news/link-detox-clean-backlink-profile/
I haven't used this particular feature they offer, but in general, their backlink analysis and breakdown tools are very good.
-
RE: OK I'll try again.... Linking root domains and external links
Hi Jeremy,
Good question - in the crawl diagnostics, the number of links is the number of OUTGOING links from your page to other pages, whether they point to another page on your site or to external domains. It's a measure of the total number of links on the page crawled, and tries to show how the value of your page gets diluted across its various links.
The next two columns, page authority and linking root domains, try to show the strength of the page in question. Page authority is a measure of the strength of your page on a scale of 1-100, and linking root domains is the number of root domains pointing in to that page. I believe it includes both internal and external domains - thus, a page showing 1 usually means it's only being linked to internally.
They show you the strength of the page because if the page is strong, though there are more links on the page, this may still be ok, because the page is stronger and has more juice to spread across the site/web.
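As a rough sketch of how that outgoing-link count works, here's a small counter built on Python's standard html.parser - the markup and domain are hypothetical:

```python
from html.parser import HTMLParser
from urllib.parse import urlsplit

class LinkCounter(HTMLParser):
    """Count internal vs external outgoing links on a page."""
    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "") or ""
        host = urlsplit(href).netloc
        if host and host != self.own_domain:
            self.external += 1
        else:
            self.internal += 1  # relative links and same-host links

# Hypothetical page fragment:
html = """
<a href="/about">About</a>
<a href="http://example.com/contact">Contact</a>
<a href="http://othersite.com/">Partner</a>
"""
counter = LinkCounter("example.com")
counter.feed(html)
print(counter.internal, counter.external)  # 2 1
```

The crawl diagnostics figure corresponds to the total (internal plus external) here.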
Mark
-
RE: Keyword at homepage
It really depends on the site - often a website will have the most links pointing to the homepage, so this is usually the strongest page. Thus, it may be easiest to rank your homepage for your primary keyword. On the other hand, if the most important keyword is a commercial term that will lead to higher conversions/sales from an inner page, like a product page, then don't forsake those higher conversions for the link benefits of the home page.
In addition, if you're going to be aggressively link building for your target term, if your inner page gets hit with a penalty, you can always 404 the page, remove the links, and start over. This is much harder to do when the overoptimized, penalized page is the homepage. So from that perspective, it may pay to optimize an inner page for your target keywords and not your homepage.
Your homepage often serves as the front page for your business - as such, it has to cater to multiple audiences, and often multiple parties who all are fighting for space on the homepage. When I worked at a not for profit museum, there were lots of different involved parties vying for space on the homepage. Thus, it may make more sense to target your primary keywords to inner pages where you can completely control your content and onsite setup.
Thus, while often it makes sense to target your most important terms to the homepage, it really depends on the site and the individual circumstances, and I wouldn't recommend targeting your homepage with your most important keywords as a blanket rule for all cases.
Good luck,
Mark
-
RE: Avoid Multiple Page Title Elements ?
Not quite sure what you meant by all of that, but if I understood correctly:
You have two <title> elements in your code:
One on line 4 of the code:
<title>GamesChannal – Video Games Review</title>
One on line 1739 of the code:
<title>gmaeschannal</title>
This second one is immediately preceded by another <head> tag on the page.
Basically, your site isn't coded properly, and you should fix it - you should only have one <head> section, and one <title> element - there may be other elements duplicated as well, but this is what I see from a quick check of the code.
Mark
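As a quick follow-up, here's a small sketch using Python's built-in html.parser to flag pages with more than one <title> element - the markup below is a simplified, hypothetical version of the situation described above:

```python
from html.parser import HTMLParser

class TitleChecker(HTMLParser):
    """Collect every <title> on a page - a valid page has exactly one."""
    def __init__(self):
        super().__init__()
        self.titles = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
            self.titles.append("")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.titles[-1] += data

# Simplified stand-in for the broken markup (a second head/title mid-page):
page = ("<head><title>GamesChannal - Video Games Review</title></head>"
        "<body>...<head><title>gmaeschannal</title></head></body>")
checker = TitleChecker()
checker.feed(page)
print(len(checker.titles), checker.titles)
```

Any count above 1 means duplicated elements that should be cleaned up.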
-
RE: Rank tracker free one and paid one
A great free tool for quick KW checks is the SEO Book rank checker - it's an extension for Firefox and is pretty easy to get set up and running in no time
-
RE: Best X-Cart SEO Solutions / Modules?
This is the SEO solution I've been using on a site and it's been pretty good. Just make sure you have good site developers handling and maintaining the site - if the site goes down or becomes very slow and unresponsive, no SEO plugin is going to help you.
-
RE: How Google Adwords Can Impact SEO Ranking ?
I totally agree with you and have seen the same thing myself.
The other positive aspect of running PPC ads is that it increases your exposure (both search and display) and leads to a rise in branded searches. Branded searches, like direct traffic, often convert pretty nicely, and they help send more signals to Google that you're a brand and should get the lovely treatment Google often provides to brands. So I do think there are residual benefits to PPC advertising for SEO.
-
RE: Mozbar sees the 301, but no other header checker does
The Chrome extension Redirect Path (built by Ayima) does see this as a 301 - it's an awesome extension and very helpful, I use it all the time.
Google is definitely seeing this as a 301 redirect - when you run the info: search operator on the URL, they show you the horizonblue URL - they definitely understand that this -bcbsnj URL has been replaced with the horizonblue one - check it for yourself here