Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: On-Page / Site Optimization

Explore on-page optimization and its role in a larger SEO strategy.


  • Can you explain why? I've looked around a lot on this, and there doesn't seem to be much agreement about it. https isn't costing anything with respect to performance, so why not do it? I'll happily remove it from all the pages where it isn't required if it's really making a difference.

    | Kellster
    0

  • Do you know software to check texts for plagiarism, e.g. Copyscape or PlagScan? Take some of that content from the original source on your site, check it using plagiarism software, and take a look at the result. If it indicates that these parts already exist, then it's duplicate content (DC). The software shows you similarity as a percentage; if it's too high, then the chances are very good that Google will regard it as DC as well.
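A rough sketch of the kind of percentage check described above, using only Python's standard library (Copyscape and PlagScan compare against a web-scale index; this only compares two texts you already have, so it's an illustration of the idea, not a replacement for those tools):

```python
from difflib import SequenceMatcher

def similarity_percent(original: str, candidate: str) -> float:
    """Return a rough similarity score between two texts, 0-100."""
    return SequenceMatcher(None, original.lower(), candidate.lower()).ratio() * 100

source = "The quick brown fox jumps over the lazy dog."
copy = "The quick brown fox jumped over a lazy dog."
# A high percentage suggests the text might be flagged as duplicate content.
print(f"{similarity_percent(source, copy):.0f}% similar")
```

A dedicated plagiarism service also finds *where* the overlapping passages live on the web, which is the part you can't reproduce locally.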

    | dotfly
    0

  • Hi Martijn, it's been about two weeks since I fixed up all the errors that appeared in the tool. I'm aware that they are scaling it down. It's a shame if so, because I rank ahead of my direct competitors, and they have the markup when appearing below me in the SERPs. I just wanted to make sure that it looks OK in the tool. Nothing appears in the preview pane.

    | MrPenguin
    0

  • Your directories have duplication. For example: http://www.titanappliancerepair.com/about-us.html and http://www.titanappliancerepair.com/about-us You may also need: RewriteRule ^(.*)\.html$ /$1 [R=301,L]
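For reference, a minimal .htaccess sketch showing that rule in context (this assumes Apache with mod_rewrite enabled; test on a staging copy before deploying, since a bad rewrite can redirect the whole site):

```apache
RewriteEngine On
# 301-redirect any URL ending in .html to the extensionless version,
# e.g. /about-us.html -> /about-us, so only one URL per page gets indexed.
RewriteRule ^(.*)\.html$ /$1 [R=301,L]
```

Note the escaped dot (`\.html`): an unescaped `.` matches any character, which can catch URLs you didn't intend.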

    | MickEdwards
    0

  • Just wanted to add my two cents and make sure you're all set. Pretty sure everyone here nailed it. The brand name is a little ambiguous, and there are other things out there that could legitimately rank for it. It is a little curious that Google is not ranking it for the brand yet Binghoo is. One quick example: the Twitter account still links to their old domain - https://twitter.com/fsstudiodev - fix a whole bunch of little things like that, and in aggregate they should add up to clearer brand signals. One thing I'll mention which no one else did: rel publisher - this connects the G+ page to the website and is probably THE most powerful brand signal you can easily send in Google's eyes. I would do that as a high priority.

    | evolvingSEO
    0

  • One of the ways that we have created new pages is by building a landing page system, where we had separate pages that targeted each of the groups of consumers we wanted to reach. We created these pages on domains that were not our main website, allowing us to aim our home page at all of those consumers and draw users in by targeting them specifically with the landing pages. If you would like to hear more about our system, check out the blog post we wrote about it at https://www.jtechcommunications.com/blog/blog-detail-11

    | JTsem
    0

  • You should be very happy that you're taking up more real estate in the SERP for a given keyword. They are of course "competing" - in the sense that every page in Google's index is being weighed to determine which is the best result for that query - but they aren't directly hindering each other. The last thing you should do is remove a well-ranking page. If you would much prefer one page to rank higher than the other, then I would find ways to pass a little relevance from page B to page A, such as with an internal link or additional optimization. Otherwise, remember that total clicks to two pages ranking lower on the page may very well equal or even exceed clicks to a single page that ranks higher.

    | David_Veldt
    0

  • Yes, the new Moz Analytics does not give you the page that is conflicting like the old system did. I hope they fix this soon.

    | Stephan_Boehringer
    0

  • Hi, I think this is a good question, and one of those where you should think about it from a user perspective. First of all, you want a URL that: A) Is easy for the user to remember; B) Describes the content on the page as well as possible, so that it's obvious to the user what the page is about even before clicking the URL. In this post, Rand says that the URL should be < 90 characters: http://moz.com/blog/visual-guide-to-keyword-targeting-onpage-optimization Moreover, on this page you can see that there's a limit on URL characters in Internet Explorer (versions 4-8, though there are also limits in newer versions), which is 2083 characters: http://moz.com/learn/seo/url There's also some additional good information on that page, for your reading. For more information on length from a technical standpoint, I'd recommend the top answer in this thread: http://stackoverflow.com/questions/417142/what-is-the-maximum-length-of-a-url-in-different-browsers This blog post indicates that most of Google's URLs are around 70-90 characters, which may be a good indication: http://www.johnfdoherty.com/lessons-from-google-about-url-lengths/ As a final note, SERPs seem to only show around 50 characters in the URL field, so this may also be something to keep in mind; however, they do tend to highlight the keywords and shorten the rest of the URL (for example www.google.com/.../insidesearch/howsearchworks/...). To summarize, I do not think there's a definite best practice for URL length; as a general rule of thumb, as long as it makes sense to the user, it should make sense to the search engines too. I hope I was able to help somewhat.

    | JSTRANDELL
    0

  • Cyto, that should be interesting to see.

    | maestrosonrisas
    0

  • Thanks again for the reply. One other question: if I wanted to link my plumbers page to 4 different area pages, would the correct way be to use the same keywords, for example: plumbers london, plumbers essex, plumbers surrey & plumbers berkshire, as these are the phrases I wish to rank for? Or should I vary the anchor text? Cheers

    | Bossandy
    0

  • These are two different search engines (Bing, which also powers Yahoo, and Google), and each has a different ranking algorithm. I think you need to read this post in detail to get your answer: http://www.searchmetrics.com/en/white-paper/ranking-factors-bing/ Thanks

    | Asjad
    0

  • Thanks Jeff, guess I could link out from the site to the fuller explanation in the blog post? Ash

    | AshShep1
    0

  • I would organize reviews by most recent, as it shows visitors what the latest thoughts are on the product, and they can better learn what they are looking for, especially if the products have changed in any way, such as a new formula, packaging, etc. A sampling would be fine for a testimonials-like page, but it might be obvious to customers that it's not entirely truthful if there are only good-to-great reviews on the product pages.

    | customerparadigm.com
    0

  • I have a suggestion, but please do your own research to verify this... My understanding is that you can set up a block of text on your site that is not visible until a visitor does something like clicking a tab or mousing over an element, and if you use the CSS rule display:none; to make it invisible, that text will not be read by the major search engines. The text can then be displayed when the visitor clicks the tab or mouses over by switching the CSS to display:block; (or some other value of display). I think this is because a spider can't activate the tab or mouseover function. I've read this in a few places and avoid using display:none; on valuable content for this reason (better safe than sorry). But I have not read about someone using it to purposely hide content to prevent duplication issues or dilution of keyword density. It could be verified with a pretty easy test: just add some gibberish content to an existing page, but hide it using display:none;. Then use GWT to ask for a recrawl, wait a few days, and do a search for the gibberish content. If it doesn't show up in the results, that is a good sign it worked. Let us know if you test it...
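A minimal sketch of the toggle pattern described above (the class and id names here are illustrative, not from any specific site, and whether crawlers index the hidden text is exactly the open question being discussed):

```html
<style>
  /* Hidden until toggled; this is the display:none; technique in question. */
  .extra-info { display: none; }
  .extra-info.open { display: block; }
</style>

<a href="#" onclick="document.getElementById('more').classList.toggle('open'); return false;">
  More details
</a>
<div id="more" class="extra-info">
  This block is invisible until the visitor clicks the link above.
</div>
```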

    | GregB123
    0

  • I am having the same issue currently with one of my client's sites. No robots.txt is or was ever in place. I cannot find any issues at all that would cause the problem. Was wondering if you ever had any resolution to this? The client also uses Yoast.

    | DavidFaltz
    0

  • Hi SearchParty, I definitely vote in favor of the latter - a single website with landing pages for each of the physical locations. With this method, the marketing you engage in will go toward building the authority of all locations. Mini sites with exact match domains tend to be candidates for thin and duplicate content, so I'm not a fan. Beyond this, I think they overly-complicate the scenario, when a single domain with strong content for each location keeps things simple for human visitors and the bots.

    | MiriamEllis
    0

  • Hi SearchParty, If these are all real physical locations to which customers come to do business, or from which staff goes to serve clients, then my rule of thumb is to include up to eight of them in the footer, properly marked up using Schema. There should also be a unique page on the website for each location.

    | MiriamEllis
    0

  • Ian, I don't think anyone is going to be able to help you without being able to see your site, do some checking around the SERPs, etc... If you don't want to provide that information, we understand, but I don't think anyone could provide you with a useful answer. As far as new sites going up and then dropping down in the first few months, we used to call this the "Google Sandbox effect". Nobody talks about it anymore for various reasons, but many of us believe that there is still a period for a new site where Google will give it the benefit of the doubt in order to collect metrics about it from the SERPs, such as users' reactions to it appearing in the search results for different queries. Once that data is crunched, the rankings often go down. That's just a theory, but I thought I'd throw it out there. It doesn't seem to make sense in this case, though, as it doesn't explain the Moz metric drops, which are not associated with Google at all. Google recently cleaned house a bit on structured data in the SERPs, which might explain that part of it. Even if your data markup was using the correct syntax, it wouldn't help if you don't have the trust metrics necessary for Google to deem the site worthy of structured data in the SERPs.

    | Everett
    0