Category: On-Page / Site Optimization
Explore on-page optimization and its role in a larger SEO strategy.
Robots.txt file
As Ryan said, a robots.txt file is very useful when you want to block (disallow) certain pages. If you don't want a spider to crawl a page at all, you must use robots.txt; a noindex tag lets the bot crawl the page, just not index it. I have only a small website, but I still dropped a robots.txt into my root folder. Writing just Allow: / may be unnecessary, but it tells the crawlers: "I respect protocols."
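For illustration, a minimal robots.txt sketch (placed at the site root; the /private/ path is a hypothetical example):

    # Applies to all crawlers
    User-agent: *
    # Block a hypothetical private directory; everything else stays crawlable
    Disallow: /private/

The bare "I respect protocols" version mentioned above is simply a User-agent: * line followed by Allow: / (or an empty Disallow: line, which means the same thing).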
| Greenman
References and SEO?
Hmm, the question isn't very clear. The question only you or your team can answer is: "are these the keywords we want Google to rank us for?" If yes, then no problem. If not, then you can do several things:

- Block the reference pages from being indexed using the meta robots tag (see the sketch after this list)
- Submit a sitemap so Google better understands your site structure
- Revisit your on-page SEO to make sure your HTML reflects the keywords you want to rank for
- Restructure the site so that (1) your authoritative content is more prominently linked and (2) the reference material comes after the real content in the page markup
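A minimal sketch of that meta robots tag, placed in the <head> of each reference page (noindex keeps the page out of Google's index, while follow still lets crawlers follow its links):

    <meta name="robots" content="noindex, follow" />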
| AdoptionHelp
Installing a site at the top-level domain directory vs. a deeper directory
Without seeing your server configuration, I'll take a stab at this. I see only one potentially significant issue: inbound links. When someone links to the website, do you want to count on them linking to www.site.com instead of www.site.com/catalog? Shorter URLs are easier to remember, and they can improve direct type-in traffic. For navigational purposes, your server can resolve to the site homepage when the incomplete URL is entered into the browser, but a 301 redirect doesn't carry over all of the PR. Less significant is the superfluous term in your URL. Legend has it that shorter, simpler, more relevant keywords in your metadata and URLs are the way to go. Will a single additional term drop you in the SERPs? Highly unlikely. Most ecommerce sites have to use a similar format. If you don't have reasons of necessity, go with the short URL. 301 redirects (from URL variations 2, 3, 4, etc. to a single URL format) should suffice for sites that have been live for some time.
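If the deeper /catalog URLs are already live, a minimal Apache sketch (assuming .htaccess with mod_alias is available; the domain is a placeholder) that permanently maps every /catalog path to its root equivalent:

    # /catalog/some-page redirects to http://www.site.com/some-page
    Redirect 301 /catalog http://www.site.com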
| RDK
Google Place Pages - Definitive Guide?
I believe the recommendation is that the title be strictly the business name (with no keywords in it). As long as all the offices are real locations and you've verified them either by phone or mail, I don't think that's ever been an issue. You may want to do other things to differentiate the locations from one another, such as unique photos, unique names of staff members, and so forth. And if each location has its own page on your website, that's even better. -Dan
| evolvingSEO
Creating optimized content: how to standardize the process?
Thank you so much for the answer, Lewis; this is especially true after the Google Panda update. Do you think the same holds for Bing and Yahoo! as well?
| YESdesign
Should I fix 302 redirects?
302 redirects signal a temporary move. If the redirects are actually permanent, then change them to 301s.
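A quick way to check what a URL currently returns, plus a minimal Apache sketch of the fix (assuming .htaccess with mod_alias; the paths are hypothetical):

    # Inspect the response headers; look for "301 Moved Permanently" vs "302 Found"
    curl -I http://www.example.com/old-page

    # In .htaccess, change "Redirect 302 ..." to:
    Redirect 301 /old-page http://www.example.com/new-page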
| PeterAlexLeigh
Meta tags disappeared in Google?
Hi Steve, With regard to your meta description: <meta name="description" content="Fight a Denver Traffic Ticket. Ticketvoid fights your speeding ticket with the best Colorado Traffic Attorney. Beat traffic violations with a Denver Lawyer." /> Unfortunately, there is no guarantee that Google will use the meta description that you specify. While it's frustrating, I have seen several instances where Google selects alternate text to use in place of the specified meta description. Best, W
| Wayne76
How long after a URL starts showing a 404 does Google stop crawling?
Thanks for the responses. I tend to agree, but my concern is the "permanent" nature of a 301. I assume Google will re-crawl the old URLs and eventually replace the previous redirects with the new ones.
| AndrewMiller
Page title getting cut off in SERPs even though it's under 70 characters?
When did you make changes to the title tag? The new title tag will appear in the SERP only when the page is re-indexed by Google.
| OptimizeSmart
Too many on-page links
"Tried to set up my campaign to look at the root domain, but it said that my website is set up as a subdomain and automatically converted this step of the process to 'subdomain'."

I believe this would be due to either the root domain being redirected to the subdomain, or there not being any content at the root domain address. If another mozzer does not offer a definitive answer, try contacting the SEOmoz help desk for more information.

"When you say 'choose' one version of my site, do you mean as far as what SEOmoz crawls? Or are you suggesting I make a change to the site itself?"

The site itself. You have one website, and its content should only be available via one address. Think of it this way: you can create a website at the address "mysite.com". Next, you create another site at the URL "www.mysite.com" which is an exact duplicate of the "mysite.com" site. This is exactly what you have done. You can even repeat the process further and create subdomains such as "www1.mysite.com". Each subdomain is a duplicate of the main domain and causes confusion for users and search engines alike. Resolve this confusion: choose ONE way to present your site and remain consistent.

"The blog is, I believe, set up as a subdomain (www.blogs.aerohive.com) and it is hosted by a third party."

The URL "blogs.aerohive.com" is indeed hosted elsewhere, but the URL "www.blogs.aerohive.com" is hosted at the same location as your main site. It is a mirror of your main site. Remove this subdomain.

"I am trying to understand what was set up correctly or incorrectly within SEOmoz, and what I can fix with my website."

Presently there are two major issues which need to be resolved. Both issues are with your website itself, not the SEOmoz tools. If you have managed hosting, the easiest step is to call or open a ticket with your hosting provider and make two requests:

1. Add a 301 redirect from all non-www traffic to its www equivalent (a minimal sketch follows below)
2. Delete the www.blogs.aerohive.com subdomain

You should be able to copy the above two requests into a ticket. Your hosting provider should completely understand what actions are necessary.
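For reference, a minimal Apache sketch of request 1 (assuming .htaccess with mod_rewrite is enabled; substitute the real domain):

    RewriteEngine On
    # Redirect any non-www host to its www equivalent, preserving the path
    RewriteCond %{HTTP_HOST} ^aerohive\.com$ [NC]
    RewriteRule ^(.*)$ http://www.aerohive.com/$1 [R=301,L]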
| RyanKent
What is wrong with this site?
Hi Atul. Shelly offers a great response and gets a thumbs up from me. A few additional points of feedback:

- The site is currently duplicated: it is available in both the www and non-www URL formats. This major issue needs to be addressed.
- The site integrates with Facebook. Add Google+ and Twitter as well. Social sharing can be added to each and every "product", which in this case is the papers.
- For the home page, I dislike seeing index.php. It is not helpful to users nor search engines. Which URL looks nicer to you: http://allkindofessays.com/index.php or http://allkindofessays.com/ ? (See the sketch after this list.)
- The GoDaddy SSL certificate is fine, but the "SSL Certificates" text underneath it is totally unnecessary and I would remove it.
- On your home page the sidebar has "Subjects", and the exact same links and subjects are repeated in the main page area. I would suggest one or the other, but not both.
- Shelly mentions an XML sitemap, but I would also recommend an HTML sitemap for users. The XML sitemap is great for search engines, not users.

Good luck.
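On the index.php point, a minimal Apache sketch (assuming .htaccess with mod_rewrite) that 301s the index.php URL to the bare root:

    RewriteEngine On
    # Only match direct requests for /index.php, not internal rewrites
    RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.php
    RewriteRule ^index\.php$ http://allkindofessays.com/ [R=301,L]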
| RyanKent
Blog On Domain Or Off?
Thanks to all for the great feedback. I only wish I had read this last week: unfortunately, I already created a subdomain. Our primary domain is fairly antiquated and not conducive to blogging, social sharing, etc. (a legacy from before my arrival), so I guess in the short-to-intermediate term I had no choice until we overhaul the website next year.
| Timmmmy
What about that stuffed footer?
SEO for large websites is different than SEO for small or mid-sized websites. Zappos.com has over 6 million pages indexed by Google, and having 200 footer links is very helpful for the search engines in this situation. Think back to the old rule of "every page in 3 clicks". It's still a sound policy, as it makes it easier for the search engines not only to find pages, but for those pages to retain enough PageRank to be viable and not go supplemental. I picked several unusual shoes (well, at least unusual to me). For each shoe I was able to get to its specific product page in 3 or 4 clicks. Even though some shoe brands or styles have paginated listings going 30+ pages deep, Zappos.com has found a way to provide a much quicker path to the product pages. Rottentomatoes.com also has a slick footer navigation that gets to any movie within 3 clicks.
| NebraskaChicagohh
Campaign on-page ranking different than the on-page tool
Hi Tracy, You can use the On-page tool to optimize for any keyword term as long as it is loaded into the keyword list in your campaign. By default, the On-page tool offers a report card for every keyword already loaded that it finds relevant to your home page. You can create report cards for individual pages and keywords by digging a little deeper into the tool. There is an explanation of how to use the tool for specific pages in this recent thread about SEOmoz On-page tool issues. If you have not specifically selected the URL of that page (as described in the other thread), then the B report card you are seeing would be the grade for that keyword on the home page of the site, not for the specific page you have optimized. If this is not the problem and the tool appears to be malfunctioning, then the best thing would be to contact the Help Team at help [at] seomoz.org. Let them know the details of the specific campaign and the issue so they can take a look for you. Hope that helps, Sha
| ShaMenz