Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
WordPress URL Link Issue
Hi Kashif, that is the standard .htaccess file set by WordPress, containing its rewrite rules. Your site has "friendly URLs" (pretty permalinks) enabled. The default WordPress URLs carry extra information (e.g. index.php for the front page, the post date for posts), and the rules in the .htaccess file route all of those requests through index.php so the friendly URLs resolve correctly. There is no reason to worry about it. Marek

Example of a standard WordPress .htaccess file:

# BEGIN WordPress
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
# END WordPress
| mad2k0 -
An EMD with the top-level domain of another country. Still useful?
Hi Vikas, as you know, a country-specific TLD is a strong hint to Google about the target audience for the site. Personally, I do not recommend this approach of naming one place in the domain while using the geo-specific TLD extension of an entirely different country. It's confusing for both visitors and search engines. We also have to consider how important the user experience factor is from a search engine's point of view, and I strongly feel that UX can be a very important factor in ranking a page in Google going forward, if it isn't one already today. Best regards, Devanur Rafi.
| Devanur-Rafi0 -
Crawl Diagnostics Report 500 error
500 errors can be caused by a multitude of reasons, and for the non-technical they can be very hard to track down and fix. The first thing I would look at is whether it's a repeating problem in Google Webmaster Tools or a one-time issue. These errors will show up in GWT for a long time, but if it's not a repeating problem it's probably nothing you need to worry about. Wait, I assumed you found the problems in GWT, when you may have found them using the SEOmoz crawl report. Either way, you should probably log into the Google Webmaster Tools Crawl Errors report and see if Google is experiencing the same problems. Sometimes 500 errors are caused by over-aggressive robots and/or improperly configured servers that can't handle the load. In this case, a simple crawl-delay directive in your robots.txt file may do the trick. It would look something like this:

User-agent: *
Crawl-delay: 5

This would request that robots wait at least 5 seconds between page requests. But note, this doesn't necessarily solve the problem of why your server was returning 500s in the first place. You may need to consult your hosting provider for advice. For example, Bluehost has this excellent article on dealing with 500 errors from their servers: https://my.bluehost.com/cgi/help/594 Hope this helps! Best of luck with your SEO.
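If you want to confirm how a well-behaved crawler will read a crawl-delay directive like the one above, Python's standard library can parse robots.txt rules directly. A minimal sketch, with the rule lines and bot name purely illustrative:

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt body in memory rather than fetching it over HTTP.
rules = [
    "User-agent: *",
    "Crawl-delay: 5",
]

parser = RobotFileParser()
parser.parse(rules)

# Any bot matched by "*" should wait 5 seconds between requests.
print(parser.crawl_delay("MyBot"))              # -> 5
# Nothing is disallowed, so fetching is still permitted.
print(parser.can_fetch("MyBot", "/some/page"))  # -> True
```

Note that this only tells you what the file asks for; it is up to each crawler to honor it.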
| Cyrus-Shepard0 -
Duplicate content problem?
I know of Homestead from many years ago, but not well enough to know its inner workings. With most systems like that you can't add canonical tags and you can't do redirects. I don't even know if you can set up Google Webmaster Tools for it. For a business, Homestead is a very bad idea. If you can set up Webmaster Tools, do that, then set up the domain on a shared server, build the site there, and tell Webmaster Tools you are moving the site. If you know of incoming links, you could ask the other sites to change them. Maybe someone else here has better internal knowledge of Homestead. Unless they are receiving substantial traffic, you may be better off starting fresh: change all the pages on Homestead to address some particular part of their business, with completely different (but related) content, and link from those pages to the new site. I just saw bjgomer13 say the same.
| loopyal0 -
Can too many pages hurt crawling and ranking?
Hi, I don't believe having too many pages will hurt crawling and ranking. Actually, having a lot of pages gives crawl bots more pages to crawl, and when someone searches for keywords related to your pages, your pages might show up. The only two problems I see from having too many pages are these. First, are they all unique? With a lot of pages it is hard to manage and keep track of whether all of them are unique, and if you have a lot of duplicates, that will hurt your ranking. Second, are you inter-linking all your pages, and can the bot crawl all of them? You will need a good linking structure that leads bots to the different pages so they can be crawled, which again is difficult to manage at that scale. One solution I see is submitting a sitemap, but I am not sure they will index everything, since I had a problem with Google only indexing 4% of my sitemap and still can't find a solution. Hope this helps!
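One way to sanity-check the inter-linking point above is to walk the internal link graph from the homepage and list any pages that nothing reachable links to. A minimal sketch over an in-memory link map (the page paths here are made up):

```python
from collections import deque

def find_unreachable(links, start):
    """Return pages in the link map that a crawler starting at `start` never reaches."""
    seen = {start}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return sorted(set(links) - seen)

# Hypothetical site: /orphan links out, but nothing links to it.
site = {
    "/": ["/about", "/blog"],
    "/about": ["/"],
    "/blog": ["/blog/post-1"],
    "/blog/post-1": ["/blog"],
    "/orphan": ["/"],
}
print(find_unreachable(site, "/"))  # -> ['/orphan']
```

On a real site you would build the link map from a crawl of your own pages; any page this turns up is one a search engine bot can only discover via the sitemap.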
| TommyTan0 -
Possible penguin hit but then back, now what's next?
Ouch, sorry about that, attached the image . what do you recommend i shoudl do now ? going to check the links you mentioned above.
| wickedsunny10 -
Title errors for pages behind a login
Well, I guess if these are secure pages, the correct approach would be the HTTP 403 status code (Forbidden). For example, this is the way Apache forbids access to phpMyAdmin if you have set up IP filtering in phpMyAdmin.conf. I am not entirely sure how your security is implemented or how the SEOmoz crawler deals with a 403, but it should just back off from these pages. Alternatively, if your URL structure allows it, you could block the SEOmoz crawler from these pages in robots.txt. Assuming your secure pages are in a directory called /secure/, we would need:

User-agent: rogerbot
Disallow: /secure/

Hope that helps! Marcus
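For the 403 approach on Apache 2.4, a minimal sketch of an .htaccess placed in the protected directory, assuming mod_authz_core and with the IP range as a placeholder for your trusted network:

```apache
# Allow only the trusted network; everyone else (crawlers included) gets HTTP 403.
Require ip 192.0.2.0/24
```

This requires AllowOverride to permit authorization directives in .htaccess; otherwise the same line goes in the server's <Directory> block.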
| Marcus_Miller0 -
Page Rank gone - technical difficulty?
Hi Moosa, I guess you were right. The PR came back Friday evening. It seems to have been an update problem...
| accessKellyOCG0 -
How could I create a sitemap with 1000 pages, and should I update the sitemap frequently?
Thank you very much.
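For a site of around 1,000 pages, generating the sitemap itself is straightforward: the sitemaps.org protocol allows up to 50,000 URLs per file, so everything fits in a single urlset. A minimal sketch with Python's standard library, where the URLs are placeholders (a real site would pull them from its database or CMS):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a sitemaps.org-format <urlset> document from a list of page URLs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages, one <url> entry each.
pages = [f"https://www.example.com/page-{n}" for n in range(1, 1001)]
xml = build_sitemap(pages)
print(xml.count("<loc>"))  # -> 1000
```

Regenerate the file whenever pages are added or removed; there is no need to resubmit on a fixed schedule if nothing has changed.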
| magician0 -
Why am I seeing a duplicate content error for categories and tags on my WordPress blog?
Thanks. I was using multiple categories and tags for the same post. I believe this is causing my issue with duplicate content. Appreciate your help.
| brytewire0 -
How do I raise my product pages' authority - ecommerce
I would not spend one moment checking the authority of my pages. Not one moment. Instead, spend that time getting generous product descriptions on those pages and create great images to display the product. All of those words will pull traffic from search on long tail queries. They will show google that you have a substantive site. They will inform the customer about your product and help make a buying decision. The images will help sell the product and reduce the number of product questions that you receive. Creating a great site with great content is what produces results. Page authority produces nothing. It is possible to develop huge page authority but be completely ineffective at the goals of your website. Create pages and sites that serve your intended visitors.
| EGOL0