Questions
Moz Crawl Test error
Hi Zoe,

I just tried to view your robots.txt file and was unable to access it, so I requested a crawl. The error Moz returned was "612 : Page banned by error response for robots.txt."

I would suggest you check the file permissions on your robots.txt file: it should be readable by the public (644 under Apache on a typical Unix host). This error means the file is not available to view, is in the wrong directory, or does not exist at all. It should sit in your web root and be accessible in a browser here: http://www.guitarcontrol.com/robots.txt

I would also note that some people feel that listing your sensitive directories in the robots.txt file is a security risk, and perhaps they are right. However, there are other ways to secure a directory and keep it out of the indexes, and if the information being protected is that important, I would imagine that all such precautions would be taken. The robots.txt file is important for crawlers, so it should always be present and readable. If security is the concern here, I would point to some high-profile sites that don't seem to have a problem with showing the public their robots.txt file:

https://www.google.com/robots.txt
https://www.facebook.com/robots.txt

I hope this helps,
Don
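As a quick sanity check on the permissions point above, a short script can confirm whether a file carries the world-readable bit that mode 644 implies. This is only an illustrative sketch run against a scratch file; the filename is a placeholder, not your live robots.txt:

```python
import os
import stat

def is_world_readable(path):
    """Return True if 'other' users (e.g. the web server process) can read the file.

    Apache typically runs as an unprivileged user, so robots.txt needs the
    world-readable bit that mode 644 (rw-r--r--) provides.
    """
    mode = os.stat(path).st_mode
    return bool(mode & stat.S_IROTH)

# Create a scratch file and toggle its permissions to see the check in action
with open("robots.txt.example", "w") as f:
    f.write("User-agent: *\nDisallow:\n")

os.chmod("robots.txt.example", 0o600)
print(is_world_readable("robots.txt.example"))  # False: a crawler's fetch would fail

os.chmod("robots.txt.example", 0o644)
print(is_world_readable("robots.txt.example"))  # True: matches the 644 advice above
```

If the check comes back False on your server, `chmod 644 robots.txt` in the web root is the usual fix.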
Moz Tools | | donford0 -
Best way to create robots.txt for my website
Hi,

First you need to understand your website's needs: decide which parts of the site should not be crawled or indexed by search engine bots. For example, if your website provides user logins and private user areas, such as a per-user dashboard, those should be blocked in robots.txt. Alternatively, you can use a robots meta tag to prevent robots from crawling and indexing a particular page, like `<meta name="robots" content="noindex, nofollow">`. You can learn more about robots.txt here: https://moz.com/learn/seo/robotstxt

Hope it helps
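The blocking decision described above can be sketched with Python's standard-library robots.txt parser; the dashboard path and example.com URLs here are just stand-ins for whatever private area you decide to block:

```python
from urllib.robotparser import RobotFileParser

# Rules blocking a hypothetical private dashboard while leaving the rest open
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /dashboard/",
])

# Crawlers that honour robots.txt will skip the dashboard but fetch public pages
print(rp.can_fetch("*", "http://www.example.com/dashboard/settings"))  # False
print(rp.can_fetch("*", "http://www.example.com/products/strings"))    # True
```

Remember that robots.txt is advisory: it keeps well-behaved bots out of the index, but real access control for a private dashboard still needs authentication on the server side.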
Technical SEO Issues | | rootwaysinc0 -
How can I improve my website on-page and off-page
As Matt mentioned above, this is very vague and would require a good amount of research. There are many tools and blogs available that help new and old sites begin the process of optimizing a website. I can give some advice on what I would do to get the initial setup going.

- Site audit - Run a site audit and find where the issues are: broken links, broken images, missing tags, duplicate content, etc. To begin ranking, it's important to make sure the site meets the search engines' technical requirements.
- Review your link profile - Do you have spammy links pointing to the site? Is the competition blowing you out of the water when it comes to links? Given your niche, there's going to be lots of competition, and chances are you are far behind in the link area.
- Blog - In many cases this is a given; however, many people don't understand the importance of this step. Add new, compelling content people will want to link to. Research long-tail terms people may be using to find your products. These give you a better chance of ranking than generic terms like "guitar", which have incredibly high competition levels.
- Syndicate your content - You've posted a blog, now what? Get it in front of people! Share it on social media and social bookmarking sites.
- Back links - I don't need to go over the importance of this step. Just remember: quality over quantity.

I hope this will help you get started. There's lots to do, but stay vigilant.

-Nick
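The broken-link part of the site-audit step above can be started with a small sketch like this (the example.com URLs and sample HTML are placeholders). Extracting every link from a page is the first half of the audit; fetching each collected URL and checking its status code would be the second:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect absolute link and image URLs from one page of a site audit."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            # href covers <a>/<link>, src covers <img>/<script>
            if name in ("href", "src") and value:
                # Resolve relative paths against the page's own URL
                self.links.append(urljoin(self.base_url, value))

extractor = LinkExtractor("http://www.example.com/lessons/")
extractor.feed('<a href="/about">About</a> <img src="images/chords.png">')
print(extractor.links)
# ['http://www.example.com/about', 'http://www.example.com/lessons/images/chords.png']
```

In practice you would feed this each crawled page, then request every collected URL and flag anything returning a 4xx or 5xx status as a broken link or image.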
Intermediate & Advanced SEO | | Chris_Hickman0