Category: On-Page / Site Optimization
Explore on-page optimization and its role in a larger SEO strategy.
-
Popup windows are coming up as 404 errors in Moz reports.
Hi Ben, sorry for the late response - hope the question is still relevant to you. Short answer: yes, this could cause a problem, depending on how many 404s you are creating. The first thing you want to do is check Google Webmaster Tools. Look at your Crawl Errors report and see if Google is also reporting these URLs as 404ing - my suspicion is that they are. Is it just one URL that is 404ing, or several? If it's only one, then you realistically don't have much to worry about, but it's always best practice to fix it. Solutions:

1. Remove the 404. Let the page load in its own browser window if need be. If you want to keep the page out of search results, there are better ways to do it.
2. Add a meta robots "NOINDEX" tag to the popup page. This will keep it out of the index.
3. After the page is gone from the index, add a nofollow to the link, or block the directory using robots.txt.

Regardless, there are quite a few ways to tackle this. Hope this helps. Best of luck with your SEO!
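A minimal sketch of options 2 and 3 above, assuming the popup is a standalone HTML page in its own directory (filenames and paths are hypothetical):

```html
<!-- popup.html: option 2 - keep this page out of the index -->
<head>
  <meta name="robots" content="noindex">
</head>
```

```text
# robots.txt: option 3 - block the (hypothetical) popup directory,
# but only AFTER the pages have dropped out of the index
User-agent: *
Disallow: /popups/
```

Note the ordering matters: if you block the directory in robots.txt first, Google can no longer crawl the page to see the noindex tag.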
| Cyrus-Shepard0 -
Encouraging users to "like" or "+1" our pages
On my high-traffic pages I have: "Do us a favor - if you like our tools, give us a '+1' or a 'like'."
| cbielich0 -
Domain registered location - SEO
Still, it seems a minor factor (like the H1 for on-page): whois data can be set to Private so as not to expose personal information publicly (which means also to Google). Google itself says this: "Server location (through the IP address of the server). The server location is often physically near your users and can be a signal about your site's intended audience. Some websites use distributed content delivery networks (CDNs) or are hosted in a country with better webserver infrastructure, so it is not a definitive signal." What that post presents as truth really seems to be just a gut feeling, and it is not confirmed by any scientific test. A totally different thing is having a local address, which is considered an important factor both for Local Search (Google Places) and geotargeted Universal Search.
| gfiorelli10 -
Client needs a basic page analysis tool
I would then come up with a small SEO plan that covers one report and charge them for it... Don't leave money on the table: charge them a flat rate per report, and when they have a bigger budget they will hire you - provided you have kept in contact with them. Don't send clients away; today's small client might become tomorrow's large one.
| Mcarle0 -
SEO for standard website pages
The about and contact pages could be good for reputation management. Write about your staff, and you have one more positive result for their name in a SERP.
| KeriMorgret0 -
Newbie with a few questions
Ok, I know, sorry. These sitelinks are selected by Google; you can only manually choose which ones you don't want to show, using Webmaster Tools.
| cfguti0 -
An ecommerce SEO question
Hi EGOL, thanks for the reply. Your idea about 'uniquifying' (I feel I may have made a word up there, hehe) the product pages is certainly one I have been considering. It does raise a couple of issues, however. Firstly, I would need to gather a considerable amount of data on the products to see which ones are selling, over a period long enough to remove any seasonal variation in the stats. The second issue is that the affiliate products pull from merchant data feeds, so any changes I make to the content will get overwritten. The idea of uniquifying (if I use it enough it might become a 'proper' word!!!) certain pages is one I am open to doing on my smaller sites, but for the larger sites it could pose some issues. An example site can be found at www ukchic co uk - I have taken out the dots in case people thought I was posting the URL for a cheeky backlink.
| Grumpy_Carl0 -
On Page SEO Tool
A desktop crawler will do most of that, too - Screaming Frog is a great option, but it's a paid tool over 500 pages (I think). I wrote a post last year comparing it and Xenu, another crawler: http://www.seomoz.org/blog/crawler-faceoff-xenu-vs-screaming-frog
| Dr-Pete0 -
I have one page on my site... but still get duplicate name and content errors.
Could be that your header is linking to the index.html instead of just the root URL.
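A sketch of the fix, assuming the header logo link is the culprit (markup and domain are hypothetical): point internal links at the root URL rather than the filename, and optionally declare a canonical so both variants resolve to one page:

```html
<!-- link to the root, not the filename -->
<a href="/">Home</a>  <!-- instead of <a href="/index.html">Home</a> -->

<!-- belt and braces: a canonical tag on the page itself -->
<link rel="canonical" href="http://www.example.com/">
```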
| KeriMorgret0 -
Long meta description
Putting nothing actually isn't always bad these days. If the pages are clearly unique, Google can create a snippet with no trouble. In fact, they often do this anyway (regardless of your META description). Most people prefer some control over the snippet (you never have total control), but I've seen cases where leaving a META description off worked fine.

There really isn't much benefit to going beyond the length limit - it's not a ranking signal, and Google will only display up to the limit. If you had a long META description, it's possible Google would display a middle section of it if that matched the query, but in most cases I wouldn't bother. You're just using up load time for something of very low value. Presumably, that text is also on the page somewhere.

All of this is to say that, while I'd lean toward the truncated version, I don't think it's cut-and-dried. I'd actually say the long version is my last pick in most cases. As @Boomajoom said, it could be a spam signal (although probably only if it's keyword-stuffed).
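For what it's worth, producing the truncated version is trivial to script. A minimal sketch in Python, assuming a ~155-character display limit (the exact cutoff varies and is not specified in the thread):

```python
def truncate_description(text, limit=155):
    """Trim a meta description to roughly `limit` characters at a
    word boundary, appending an ellipsis only when something was cut."""
    if len(text) <= limit:
        return text
    # Cut at the last space inside the limit so no word is split.
    cut = text[:limit].rsplit(" ", 1)[0]
    return cut.rstrip(" ,;:") + "..."
```

Run your existing descriptions through something like this once, rather than hand-trimming each page.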
| Dr-Pete0 -
With or without the "www." ?
Deviating slightly off-topic here, but I would say that for link inclusion on social sites you should use a service like bit.ly rather than pasting in the raw URL. My reasoning is that with a bit.ly URL, if you add a + at the end you can see statistics for that particular link (how many clicks it's had, etc.), which is nice and simple and saves crawling through Google Analytics to answer some simple, fundamental questions. In email signatures, leaflets and printed promotional material (where you're typically short on space), I agree it does make things shorter and look nicer. And who knows - maybe it will catch on, more and more people will start removing www. from their domains, it will become more of a standard, and Google and other search engines may then use it as a possible ranking factor. I must admit this has been a great discussion on this topic.
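Whichever version you pick, the important part is 301-redirecting the other one so you don't split link equity. A sketch for Apache with mod_rewrite enabled (domain hypothetical), sending non-www to www:

```text
# .htaccess - redirect example.com to www.example.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Reverse the condition and target if you prefer the bare domain.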
| blacey0 -
SEO for Image only posts
About the CSS: I have no idea, but I noticed it in other posts... What browser are you using? If you're afraid of keyword stuffing, you could:

- remove the title tag from the TEXT link
- include both IMG and TEXT in one link with one title tag
- include both IMG and TEXT in one link without a title tag, only an ALT tag

If the image also serves as a link, you could potentially include both an ALT and a TITLE tag. But with a TEXT link beneath, perhaps you would be keyword stuffing. SEO-wise (for both IMG and TEXT) it would be best to use one ALT and one TITLE per image/link combo.
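A sketch of the combined link described above (URL and text are hypothetical): one link wrapping both image and text, with a single ALT on the image and at most one TITLE on the link:

```html
<a href="/blue-widgets/" title="Blue widgets">
  <img src="/img/blue-widget.jpg" alt="Blue widget">
  Blue widgets
</a>
```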
| alsvik0 -
Fixing large number of 404s
Right on... use one of the plugins for WordPress. We use Google XML Sitemaps. Works like a charm. After you have added content, go in and recreate the sitemap and you are in business once they recrawl... make sure you clean up other issues. Another plugin that might be helpful is WordPress Importer. Here is the copy-and-paste from the plugin page on one of our sites:

The WordPress Importer will import the following content from a WordPress export file:
- Posts, pages and other custom post types
- Comments
- Custom fields and post meta
- Categories, tags and terms from custom taxonomies
- Authors

For further information and instructions please see the Codex page on Importing Content.
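For reference, the sitemap plugin's output is a standard sitemap file along these lines (URLs and dates hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/some-post/</loc>
    <lastmod>2012-01-15</lastmod>
  </url>
</urlset>
```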
| Mark_Jay_Apsey_Jr.0 -
How to stop downward drift
Also, don't wait for Google to tell you if you have duplicates. That is what I was doing - silly me! I did a comprehensive headline check and discovered that there were more than a thousand duplicates in our system because, back a few years, one of the editors was double-clicking the mouse when publishing stories - that was before I put in a mechanism to prevent it. I thought there were only occasional ones and that I'd fixed them, but it seems I hadn't! So check every page on your site for duplicate titles, duplicate descriptions and duplicate content.
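The "check every page" step is easy to automate once you have a crawl export. A minimal sketch in Python, assuming you already have a mapping of URL to title text (the crawl itself, e.g. from Screaming Frog or Xenu, is out of scope):

```python
from collections import Counter

def find_duplicate_titles(pages):
    """Return the set of titles that appear on more than one page.
    `pages` maps URL -> <title> text; comparison ignores case and
    surrounding whitespace, since near-identical titles still clash."""
    counts = Counter(title.strip().lower() for title in pages.values())
    return {title for title, n in counts.items() if n > 1}
```

The same pattern works for meta descriptions; for duplicate body content you would hash the page text instead of comparing it directly.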
| loopyal0 -
Does Google pick up on words such as "in", "the", "and" etc?
There was a time when Google simply ignored so-called "stop words" for processing efficiency, and so the two queries in your example were essentially the same. It looks like that has changed over time, though. See this post from 2008 by Bill Slawski (an expert on Google patents and technology): http://www.seobythesea.com/2008/01/new-google-approach-to-indexing-and-stopwords/ ...and a quick experiment someone did in 2010 that seems to confirm it: http://www.dougwilliams.com/blog/seo/stop-words-does-google-ignore-these-anymore.php In my experience, it's a bit specific to the query and the competition. In many cases, the addition or subtraction of a stop word may not make much of a difference, but in your case it probably does. If the term you want to target is "Holidays in Ireland" and the Top 10 for that term seems different from the shorter term's, I'd say to use "in". I'm seeing some differences between those two sets of Top 10 results (not huge, but some).
| Dr-Pete0 -
Why do I suddenly have so many more page errors?
Hey, thanks! I'll talk with my programmers to see what's up.
| janettapp0