Questions
-
Robots.txt being blocked
Hi Matt, thank you for checking back. I did change the robots.txt in the dashboard as people suggested, but when I go here: http://brownieairservice.com/robots.txt it is still showing the disallow. I need to load this: User-agent: * Disallow: to the root folder, and I'm not sure how to do that, whether I need to FTP it or something else, so that's where I'm at now. Anybody have any thoughts? I have googled how to do this, and I keep getting put into a loop of information that does not address this question directly. Thank you
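For reference, the permissive robots.txt described in the question is just a two-line plain-text file:

```
User-agent: *
Disallow:
```

One common approach (assuming a typical shared host) is to save those two lines as `robots.txt` and upload the file with an FTP/SFTP client such as FileZilla into the site's web root, usually the directory that contains `wp-config.php` (often named `public_html` or `www`, though the exact name depends on the host). A dashboard-level setting in the CMS can be overridden by, or can conflict with, a physical file at the root, so verifying the live URL after upload is worthwhile.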
Web Design | | SOM240 -
Wordpress site, MOZ showing missing meta description but pages do not exist on backend
What kind of CMS do you use? That might help figure it out. It sounds like your CMS is creating these garbage pages, whether or not they appear on the backend. That said, this is pretty common and it may not be something you need to worry about. Are the pages cached by Google? You can find out by pasting this search operator in your browser: cache:http://example.com/url or use a site: search for an overview of all your cached URLs (site:example.com). If the garbage URLs aren't there, you probably don't need to worry much. On the other hand, if you see pages of bad results, this might be something you need to address with your developer.
Intermediate & Advanced SEO | | Cyrus-Shepard0 -
Error reports showing pages that don't exist on website
I would avoid using the "remove URL" option in GWT. The 301 is the better option in my opinion. Say the old URL is posted somewhere on my website and now leads to a 404 page: with a 301 redirect in place, visitors who follow that link are taken to the new page instead, and you don't have to track me down to update the old URL on my site. The link keeps working, takes people to an active page, and can still send you traffic. The "remove URL" option gives you none of that. Here's a helpful link straight from the source on when NOT to use the Remove URL option: https://support.google.com/webmasters/answer/1269119?hl=en
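As a minimal sketch of the 301 approach described above (assuming an Apache server with `.htaccess` enabled; the paths and domain here are hypothetical placeholders, not from the original question):

```apache
# Permanently redirect a retired URL to its current replacement.
# Visitors and link equity both follow the redirect to the live page.
Redirect 301 /old-page/ https://example.com/new-page/
```

A line like this goes in the `.htaccess` file at the site root; other servers (nginx, IIS) have equivalent directives.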
Intermediate & Advanced SEO | | Millermore0 -
Keep www in the domain or not?
I have a lot of personal and professional experience with both www and non-www websites. Since you seem to have expertise in WordPress and SEO, I'll give only the main points rather than retell the whole SEO-and-WordPress story. Consider this example: domain.com occupies 10 characters in the search engine URL data, while www.domain.com occupies 14 (an extra 4 characters). If you are serious about SEO and want to rank and optimize your site for search traffic, it is better to drop the www from the website URL. Points to remember: 1. Tell Google Webmaster Tools which version of your website is preferred (www or non-www). 2. Use .htaccess to redirect from www to non-www. 3. Use the WordPress Redirection plugin initially to monitor which URLs are being redirected and how. If you are aware of the canonical URL concept, go ahead and apply it properly; all of the following can resolve to the same content: www.example.com, example.com/, www.example.com/index.html, example.com/index.php. Note: since you are using WordPress, an SEO plugin will help you apply a canonical URL. When to use what: if you are looking for branding and not expecting much search traffic, use the www prefix; if you are mainly after search traffic and the website is not a brand (for example: services, a blog, a keyword-rich domain), remove the www. Consider examples like moz.com and yoast.com.
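The www-to-non-www redirect in point 2 above can be sketched in `.htaccess` like this (assuming Apache with mod_rewrite enabled; the non-www version is the preferred one here, and `http` would be `https` on a site with SSL):

```apache
# Force the non-www hostname with a permanent (301) redirect.
RewriteEngine On
# Match any host starting with "www." and capture the rest.
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
# Redirect to the same path on the bare domain.
RewriteRule ^ http://%1%{REQUEST_URI} [L,R=301]
```

Flipping the preference (forcing www instead) just means inverting the condition and prepending `www.` in the rule's target.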
Educational Resources | | SEMServices0 -
Has anybody used Yext or Universal Business Listings as an automated approach to getting clients into the many directories? If so, does it work? Or does Google penalize for using these automated services?
Another wrinkle with using Yext: it appears that the links on many of the citations created by Yext do not link directly to the business. Instead, they pass through Yext first before being redirected to the business.
Inbound Marketing Industry | | flowsimple0