Questions
What content should I block in WordPress with robots.txt?
Here is what I have changed it to. I found various articles, including the one listed above, and decided to go with this; I'm not sure if it is good or bad: www.ensoplastics.com/robots.txt
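For comparison, a commonly recommended minimal robots.txt for a WordPress site looks something like the sketch below. This is an illustration, not the contents of the linked file, and the Sitemap URL is a placeholder:

```
# Minimal WordPress robots.txt sketch: block admin screens,
# keep content and the AJAX endpoint crawlable.
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap.xml
```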
Intermediate & Advanced SEO | ENSO
Help needed with robots.txt regarding WordPress!
I've just looked at the home pages of the two sites, and they are pretty much the same apart from substituting "plastics" with "bottles". I'm not an expert, but I would have thought Google might treat this as duplicate content. In my opinion, I would concentrate on one of the sites, say plastics, and have the bottle-specific content as a subsection. I'm not sure how the sites rank, so that may be easier said than done. As for the sitemap/robots question: if you continue with two sites, I would recommend generating a new sitemap for the copied site.
Technical SEO Issues | surfgimp
Google Webmaster Tools error?
It won't solve your duplicate page issues. Without looking at the site, it's not easy to say exactly why you're getting duplicate page issues. You can start by ensuring you have a 301 redirect from the root domain to the www subdomain, or vice versa (it's up to you): your-domain.com >> 301 redirect >> www.your-domain.com
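On an Apache server, that redirect can be sketched in .htaccess as below. This assumes mod_rewrite is enabled, and the domain is a placeholder:

```
# Hypothetical .htaccess sketch: 301-redirect the bare domain
# to the www host, preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^your-domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.your-domain.com/$1 [R=301,L]
```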
Technical SEO Issues | lavellester
Robots.txt is blocking WordPress pages from Googlebot?
Delete everything under the following directives and you should be good:

User-agent: Googlebot
Disallow: /*/trackback
Disallow: /*/feed
Disallow: /*/comments
Disallow: /?
Disallow: /*?
Disallow: /page/

As a rule of thumb, it's not a good idea to use wildcards in your robots.txt file; you may inadvertently exclude an entire folder.
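To illustrate the wildcard risk with an invented example (the URLs below are hypothetical, not from the site in question): a pattern like `Disallow: /*?` matches any URL containing a query string, so it can block pages you actually want indexed:

```
# Hypothetical illustration of wildcard over-matching:
User-agent: *
Disallow: /*?   # intended: block query-string duplicates
# ...but this also blocks URLs you may want crawled, e.g.:
#   /products.html?utm_source=newsletter
#   /category/news?page=2
```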
Intermediate & Advanced SEO | Desiree-CP
Why does this page show it has 166 links in the crawl?
Links to categories, login, search, and Facebook, plus links in the post itself and in its 57 comments, will easily add up to 100+ links on the page.
Technical SEO Issues | Host1
So I am creating an XML sitemap, but what can I do to make it look better?
Thanks for the response; I had a feeling that would be the answer.
Intermediate & Advanced SEO | ENSO
How do I backtrack broken links?
You can also use the really cool freeware tool "Xenu's Link Sleuth" to crawl your own pages. It offers many more features, like page depth, header information, etc., and best of all, it is free. Just crawl your site and sort by status, then right-click on the 404'd page to see which pages link to it.
Link Building | Sebes
Canonical URL problem
ensosplastics.com, www.ensosplastics.com, and www.ensosplastics.com/index.html all bring up the same pages. My suggestion would be: 301-redirect all URLs to either www.ensosplastics.com or ensosplastics.com. Google is pretty smart about discovering and judging index.html and /, but to be safe, add a rel="canonical" pointing to either / or index.html. Another suggestion: use only lowercase, not CamelCase, in URLs and folders (i.e. /aboutus/aboutus.html, not /AboutUs/AboutUs.html). You might even want to drop the folder altogether and just use /pagename.html.
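As an illustration of the canonical suggestion (assuming the www root is chosen as the preferred URL), the tag would go in the head of every duplicate version of the page:

```
<!-- Hypothetical example: place in the <head> of both / and
     /index.html so they point at one preferred URL. -->
<link rel="canonical" href="http://www.ensosplastics.com/" />
```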
On-Page / Site Optimization | | Sebes0