Posts made by ShaMenz
-
RE: SEOMOZ .org?
Hi Bondtana,
I believe when Rand first started what was then a blog intended to share his own experiments in SEO, it was never intended to be commercial, and so the .org.
Obviously, it grew into a blog with a massive following and resulted in the company becoming what it is today.
Actually, I happen to think that the .org is still appropriate...it would be hard to argue that the site is not home to one of the most information-rich, industry-leading, generous and sharing communities on the Net. Not just in SEO.
Also hard to argue that there is a more authoritative information source in the Inbound marketing space.
Hope that helps,
Sha
-
RE: Website hacked
Hi Socialdude,
A look at that code suggests that the most likely point of access has to be a file that is more than just regular HTML somewhere on your site. This means that somewhere, there must be at least one php file.
My first guess would be that there is a page with a PHP-driven contact form which has been used to inject code into the site and propagate the malicious javascript into the other pages.
If you have a clean backup copy of all pages in the site (either with your friend or their developer), then the quickest fix is to upload your backup version.
If you don't have a backup, then you could try checking the Wayback Machine and see if there is a clean copy archived there which you can grab and upload to replace the hacked site.
If neither of those is an option, then the first thing to do is to find any pages in the site with the .php extension.
Rename the files by changing the file extension from .php to .txt. (If you are unsure of how to change the file extension, you can just open the files, save a copy with a .txt extension and then delete the .php version from the server)
You can now look at the file(s) that were PHP, see what has been added to the code and clean it up. You will then need to individually edit the HTML files and remove all of the bad javascript code. Now that you have everything cleaned up, create a complete backup of the site just in case you need it again in the future. Upload your clean copy and you should be good to go.
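Once you are cleaning up, a quick way to build a checklist of every file carrying the injected code is to search the local copy of the site for a distinctive string from the bad javascript. This is only a sketch: the "site" directory and sample files below stand in for your own copy, and the markers "eval(" and "unescape(" are common obfuscation tells, not your actual payload - swap in a string from the real injected snippet.

```shell
# Sketch only: create a stand-in copy of a "hacked" site so the
# search can be demonstrated end to end.
mkdir -p site
printf '<html><script>eval(unescape("%%61"));</script></html>\n' > site/bad.html
printf '<html><body>clean page</body></html>\n' > site/good.html

# List every .html file containing either obfuscation marker.
grep -rln --include='*.html' -e 'eval(' -e 'unescape(' site
```

The -l flag prints just the file names, so you get a list of pages to clean rather than having to eyeball every file by hand.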

I would also go to Google Webmaster Tools & use "fetch as googlebot" to fetch and add the index page so that Google knows you are now OK to crawl again.
Hope that helps,
Sha
-
RE: Google indexing thousands crazy search results with %25253
Hi again Joe,
After a more detailed look at your site (which has no obvious search box available to users) I was curious as to why all of the things that you are doing on the site seem to have no effect upon the issues you are trying to resolve...and why your site is generating thousands of search queries without a search box!
This made me wonder "do you have control of all of the content?" ... and it appears that you are using an external service called Pictage to upload and display client portfolios.
So, are you pulling content into your site from Pictage? Is it some kind of white label add-on to your site?
If the pages from Pictage are being generated externally, then the Yoast plugin cannot add the "noindex" tag to those pages...if this is the case then I would say you need to contact the Pictage support people and advise them that there is a problem they need to attend to.
Hope that helps,
Sha
-
RE: Search Engine Pingler
Hi Ruslan,
If you have a Google Webmaster Tools account, you can use the "fetch as googlebot" feature to notify Google of up to 50 URLs per month. The way it works is that you run "fetch as googlebot" and if the page is fetched successfully, you are given the option to add it to the Google index.
Obviously if it is not fetched successfully then you know that there is a problem with the URL that needs fixing.
Bing also has a similar feature which allows you to manually add up to 10 URLs per day with a maximum of 50 per month.
Hope that helps,
Sha
-
RE: SEO Moz Tools - too many on the page links result driving me nuts
Hi Brian,
The feature is not live as yet, but has been lodged as a feature request and a staff member recently confirmed that it is in the works.

Enjoy your Sunday,
Sha
-
RE: Google indexing thousands crazy search results with %25253
Hi Joe,
A couple of things:
- If you have made the change to noindex search results recently, it may take some time for the errors to disappear from GWT. If the number of pages continues to grow, then clearly the noindex is not implemented as you expect.
- You could try using the parameter handling feature in GWT to tell googlebot to ignore all pages with the parameter in question. In your search string, the ? says "here come some parameters" and the "s" is the parameter that you want to ignore.
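For reference, when the noindex is working you should be able to see a robots meta tag in the head of each generated search-results page. This is a generic example of what that tag looks like, not markup taken from your site:

```html
<meta name="robots" content="noindex,follow">
```

The "follow" value keeps the links on the page crawlable while the page itself is kept out of the index.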
Incidentally, there is definitely something funky happening with the generation of those search strings which should be investigated and resolved as well.
Hope that helps,
Sha
-
RE: Google , 301 redirects, and multiple domains pointing to the same content.
Hi Steven,
OK, having looked more closely at your question and done some more investigation, it seems there are two major issues.
The first is as I mentioned, the fact that the non www is broken. I believe that this is because you do not have a setting enabled to allow the www and the non www to share the same location. Unfortunately I cannot tell you what you need to do to fix this, but have reached out to one of the SEOmoz Associates and hope that he will be able to help out.
Secondly, the fact that you have both domains pointed at the same root directory means that:
- Your domains are seen as exact duplicates of each other (a serious duplicate content issue)
- None of the link value from the original domain or any external links pointing at it are being passed to the new site.
So, the two things you need to do are:
- Fix the non www /www issue
- Create a 301 Redirect that will send each page of the old site to the corresponding page on the new domain.
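Since yours is a Windows setup, the domain-to-domain 301 would normally be done with the IIS URL Rewrite module. The rule below is only a sketch based on assumptions about your configuration: "old-domain.com" is a placeholder for your original domain, and I cannot verify the rest of your web.config from here.

```xml
<!-- web.config on the shared root: 301 every request for the old
     domain to the matching page on the new one.
     "old-domain.com" is a placeholder for the original domain. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Redirect old domain" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="^(www\.)?old-domain\.com$" />
        </conditions>
        <action type="Redirect" url="http://www.easylawlookup.com/{R:1}"
                redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

Because the rule captures the whole request path in {R:1}, each old page redirects to its corresponding page on the new domain rather than everything landing on the home page.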
Hope that helps,
Sha
-
RE: SEO Moz Tools - too many on the page links result driving me nuts
Hi Brian,
The key here is that it is a "warning" intended to alert you to the fact that the situation exists so that you have the ability to do something about it if you can.
While there are many sites (especially ecommerce sites) which are highlighted by the tool because they have extensive menus, it is also quite possible that some users have pages where the body text is full of plain text links which could be fixed.
As to whether they should be fixed, I would say definitely "yes" if they are the latter. Pages that have way too many links in the text are detrimental to user experience. They are harder to read, look spammy and the message tends to get lost if people are constantly clicking a link and going to another page.
If, on the other hand, you have a site where the large number of links is the result of the menus it is a fairly simple thing to ignore the warning.
I should mention that there is a feature request in the works which would enable users to "switch off" items in the Pro App that can be ignored. When this is implemented you would be able to remove them from the report so you don't have to keep looking at them.
Hope that helps,
Sha
-
RE: Google , 301 redirects, and multiple domains pointing to the same content.
Hi Steven,
Welcome to Q&A

I think there may actually be a few issues here which are not helping, so I am going to take a little time and do a bit more analysis so we can try to eliminate all of the problems. One thing, though, seems worrying:
The non www version of your domain http://easylawlookup.com/ returns a 404 (Not Found) Error, while the www version http://www.easylawlookup.com/ loads correctly. My first guess on why you have lost rankings so suddenly would be that the pages indexed in Google happened to be the non www versions of the URLs - so when Googlebot tried to crawl them and received a 404 Error, those pages were dropped from the index.
It is also possible that the majority of your external links were set to point to the non www URLs.
If this was the case and your changes took down the server, then returned it with the non www no longer accessible, then all of your incoming link value would have also disappeared overnight.
I cannot say at this point exactly why the non www is broken and since ours is a LAMP Shop, I am not about to hazard a guess on what might be happening in your Windows setup to make this happen. There are definitely some IIS ninjas around the SEOmoz community though, so I'm sure someone will be able to jump in on the thread and provide some guidance on that.
General practice is to 301 either of the non www or www to the other (choose the one you prefer and redirect the other to it). This ensures that all of the incoming link value and traffic benefits a single domain. It also eliminates any possibility of the two being seen as duplicates of each other. There is no point trying to set a 301 though if there is something fundamentally wrong in the system.
So, the first task is to work out why the non www is returning a 404 and to fix the problem.
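Once the underlying 404 is fixed, the non www to www redirect on IIS would look something like the rule below. Again, this is a sketch assuming the URL Rewrite module is available in your setup, not your actual configuration:

```xml
<!-- Inside <rewrite><rules> in web.config: send bare-domain
     requests to the www version with a 301. -->
<rule name="Add www" stopProcessing="true">
  <match url="(.*)" />
  <conditions>
    <add input="{HTTP_HOST}" pattern="^easylawlookup\.com$" />
  </conditions>
  <action type="Redirect" url="http://www.easylawlookup.com/{R:1}"
          redirectType="Permanent" />
</rule>
```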
In the meantime, I would leave the old domain as is - I'll do a little more digging and get back to you soon.
Hope that helps
Sha
-
RE: Spam to site with GMail addresses - any way to resolve.
Hey Robert,
I'm still waiting for that app that allows me to respond with a great big hand that comes out of their screen, grabs them by the shirt and bangs their head against the monitor! 8D
Unfortunately, it seems to be a long time coming.
Sha
-
RE: URL rewriting from subcategory to category
That's great!
Glad to help anytime.
Sha
-
RE: URL rewriting from subcategory to category
Sorry that my answer appears to have lost all line breaks...there seem to be some CSS issues at the moment.
Hoping you can copy it out and separate the lines... I will try to reformat it as soon as I can, but right now it just keeps loading funky.
Sha
-
RE: URL rewriting from subcategory to category
Hi kundrotas,
The problem you have is that the code used for .htaccess functions is not terribly "intelligent". It does not allow for the use of "IF" statements etc. This being the case, the better option is to move the action into the actual code.
This is the rule in question:
RewriteRule ^([^/]{1,3})/([^/]+)/([^/]+)/$ /index.php?lang=$1&idr=$2&par1=$3&%{QUERY_STRING} [L]
The easiest thing to do is to add a check for par1 in index.php:
// If par1 is 1 and everything else is blank, send the visitor to the root.
if ( $par1 == '1' && $par2 == '' && $par3 == '' && $par4 == '' && $par5 == '' && $par6 == '' ) {
    $location = "/$lang/$idr/";
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: ' . $location);
    exit; // stop processing so the old page is never rendered after the redirect headers
}
Hope that helps,
Sha
-
RE: How to fix and test Google's indexing / caching problem
Hi WebBizIdeas,
It seems this is a known issue with your shopping cart software.
This thread should help answer your question:
http://forum.cs-cart.com/topic/12883-problem-with-the-seo-addon/page__st__20
Hope that helps,
Sha
-
RE: Hi and Welcome Hackers & Founders - and Thank you for Sharing :)
No worries ... just spreading a little TAGFEE

Oops! Sorry Hackers & Founders...nearly forgot one of the most important things I wanted to share ... to really understand how things work at SEOmoz, you need to know about TAGFEE
Sha
-
Hi and Welcome Hackers & Founders - and Thank you for Sharing :)
Just got through watching an awesome presentation from Rand Fishkin at the Hackers & Founders event in California. Rand started out by announcing that all members will be getting 2 free months of SEOmoz Pro! So I got to thinking...it's quite likely there are going to be a few new faces around Q&A pretty soon

So, I wanted to say a few things to you all:
- Most important of all, Hi!
- Welcome to the SEOmoz community. I know the tools & content you'll find here can help your venture in so many ways and really look forward to following the story of your Startup as you reveal it to the world!
- Thank you! I'm half a world away in Australia and have yet to see Rand present in person, but today I experienced the next best thing because you were kind enough to include me by producing and sharing this great video on Hackers and Founders TV! (Well, I have seen Rand present via video at two Mozcation events, but since parts of those presentations were in Spanish and I don't understand a word of it, I'm not counting them!) Anyway, thanks! I loved it.
- If you are just getting to know SEOmoz there are a few places you absolutely should not miss:
- The Daily SEO Blog
- Pro Webinars
- Special weekly video Blog posts known as Whiteboard Friday
- The Youmoz Blog - posts by SEOmoz users
- The Beginner's Guide to SEO - definitely the place to start!
- oh and Q&A, but you're already here!
- Hope to catch up with you somewhere in the community, whether it be here in Q&A, in the blog comments, enjoying a webinar, or maybe following @SEOmoz, @randfish, @SEOmozPRO or even that handsome little devil @roger_mozbot on Twitter
Enjoy!

-
RE: Why does SEOMos Pro include noindex pages?
Hi Randy,
Basically, the crawler is not configured to skip pages carrying the "noindex" tag. Given that removing them from the crawl would hide pages that are still visible to users and may still have problems elsewhere (missing Titles and so on), leaving them in is probably a reasonable choice.
However, there is a feature request in the works at SEOmoz which would provide the option to turn off the pages that you know can be ignored because they are "noindexed". This feature will allow us to have the best of both worlds - see all the deficiencies of the page IF we wish to, and turn them off so that they are eliminated from reports if we wish to ignore them.

For now, yes, you can ignore them.
Hope that helps,
Sha
-
RE: Things to consider with regards to SEO when redesigning a website
Hi Manuel,
This blog post, "Web Design Company Fail #1", will give you an idea of some of the things to watch out for when redesigning sites.
One trap with WordPress sites is inadvertently using a template that has built-in SEO problems, so research the template you select carefully to make sure there are no known issues.
The other thing is to make sure that you plan right from the start to eliminate the standard duplicate content problems that WordPress generates by default. Using a solid SEO plugin is a good idea for this reason; I find Yoast SEO for WordPress very helpful.
Hope that helps,
Sha
-
RE: Pages Crawled: 250 | Limit: 250
Yes, a full crawl will generally be completed once a week.
You can see the date when the current crawl is due to complete at the bottom right of the crawl diagnostics overview for your campaign.
This is what it looks like:
Last Crawl Completed: Nov. 29th, 2011
Next Crawl Starts: Dec. 6th, 2011
Hope that helps,
Sha