Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I have not seen any studies indicating such a thing (my guess is that dwell time is such a strong signal of relevance that Google would never release that info, though I could be totally wrong). An idea to improve UX: if you have a page with 2 paragraphs of text, take the average time it takes 10 people in your office to read it and set your 'bounce rate' benchmark accordingly. Then you'll know whether people are actually reading it. If you have a page with 2,000 words, average that time, etc. If visitors bounce too soon, edit the text until the visitor average matches your office average. That would equal relevance, right?
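A rough sketch of the benchmark idea above. The 200 words-per-minute rate is an assumption, not from the post; the suggestion there is to calibrate it by timing readers in your own office.

```python
# Sketch of the reading-time benchmark described above.
# words_per_minute=200 is an assumed average reading speed;
# replace it with your own office's measured average.

def expected_read_seconds(word_count, words_per_minute=200):
    """Estimate how long an engaged visitor should spend reading the page."""
    return word_count / words_per_minute * 60

def likely_bounced_too_soon(word_count, time_on_page_seconds, threshold=0.5):
    """Flag a visit that ended well before the expected reading time."""
    return time_on_page_seconds < expected_read_seconds(word_count) * threshold

# A 2,000-word page should take about 10 minutes at 200 wpm.
print(expected_read_seconds(2000))        # 600.0 seconds
print(likely_bounced_too_soon(2000, 45))  # True - gone after 45 seconds
```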

    | IOSC
    0

  • You have a very typical case here, Studio33. The most efficient way to handle multiple sitemaps is to create a sitemap index file at the root of the site that lists the locations of the two actual XML sitemaps you have. Then, in both Bing and Google Webmaster Tools as well as your robots.txt file, you point to the sitemap index file. This way, you're giving the maximum number of signals for where the engines can find your sitemaps. (A sitemap in an internal directory may not be found by search engines - they're trained to look for a standard sitemap file in a standard location at the root of your site - no sense making it hard for them!) Here's Google Webmaster Tools' help doc on making a sitemap index file. Hope that helps? Paul
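For reference, a minimal sitemap index file along those lines might look like this (the domain and sitemap filenames are placeholders, not from the question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-pages.xml</loc>
  </sitemap>
</sitemapindex>
```

With a matching line in robots.txt: `Sitemap: http://www.example.com/sitemap_index.xml`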

    | ThompsonPaul
    0

  • Yeah, internally "nofollow" still dilutes PageRank, so it doesn't really have a big impact. Practically speaking, I don't think there would be much difference. The links have a customer-facing purpose, I assume, and you just don't want the end result indexed. That should be fine. The only advantage of nofollow might be that it would waste a little less crawl bandwidth (instead of bots visiting the page and then seeing it's blocked, they wouldn't bother). I've tried it both ways in the past (after Google changed the PR-sculpting rules) and could never really measure much difference.

    | Dr-Pete
    0

  • Thanks, guys. Points well taken. I considered merging simply because both sites sell basically the same products, so making each unique would be somewhat difficult. My site (A) is much larger, better optimized, ranks well, and has much more content. Site B has many products but little content, which I think is why it doesn't rank quite as well. I am concerned about how much work it will take to get B up to speed when the same amount of work put into A would probably generate more revenue. I just wondered if putting the work into A and redirecting B to A might be the best game plan. But maybe not long term.

    | residualboulders
    0

  • Hi, On blogs I usually don't use, or advise using, the brand or domain name, as blog titles are usually long - adding the domain / brand will only make them even longer. If your blog does a good job as far as branding goes, there's no need to put that in your title. As for keywords, that is always a good idea - but again, you should use the main keyword targeted by a particular post in your H1, and use the H1 also as the title tag - that can cover the basic setup. That said, I would rather not use the keyword in the title tag / H1 if it doesn't fit and isn't really readable, as Jesse also mentioned. Also, to "maximize" the SEO effect, the keyword should be as close as possible to the beginning of the title. For some titles you can use the "keyword: comment" format. So if your main keyword is iPhone and your post is actually called "the new model of iPhone", you can move the keyword to the front: "iPhone: the new model" - a silly example, but I hope it gets the idea across. Hope it helps.

    | eyepaq
    0

  • That looks like it should do the trick. Thank you!

    | SCW
    0

  • We have a CMS that automatically generates the meta description and keywords for our pages, and I hate it. I'm not sure why we even have it, and we can't get around it. We have to override it each time to get sensible tags, so it isn't worth having. Avoid it and write your own tags manually.

    | Houses
    0

  • Yes, it shouldn't be an issue. However, you can check image metadata through http://metapicz.com. I am not sure whether you can batch these up.

    | KevinBudzynski
    0

  • Bad - the sitemap is telling search engines to index, and the noindex is telling them not to. Bing, for one, will ignore your sitemap if it is more than 2% incorrect. As Chris said, why noindex? It means that all links pointing to the noindexed pages are wasting their link juice.

    | AlanMosley
    0

  • If they are linked, then that's OK. But if you change the titles of the other pages, that may stop them ranking - it won't help your target page rank.

    | AlanMosley
    0

  • No, you don't have to rel=canonical one page to another if you are already 301 redirecting that same page to the other.
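In other words, once the redirect is in place the canonical tag is redundant, because neither visitors nor bots ever see the old page's HTML. A hypothetical example, assuming an Apache server and placeholder paths:

```apache
# .htaccess sketch (paths are illustrative, not from the question).
# With this 301 in place, /old-page needs no rel=canonical tag -
# its HTML is never served, so there is nothing for the tag to sit in.
Redirect 301 /old-page http://www.example.com/new-page
```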

    | Chris.Menke
    0

  • Load times and compatibility are the only real concerns. Neither of those should be a problem if the site still works well and looks good without the video loading. I've seen this working really well on some sites (and admittedly quite badly on others). It's a great way to make a site stand out when done well, though, and that can be an SEO win in its own right.

    | matbennett
    0

  • My recommendation was just that. Take away all the excessive linking to subcategories and even primary categories, and instead add more generic but traffic-hub-important footer links. This is what I "always" recommend to clients, but since all of the Panda updates, and since the external SEO partner (who is responsible for the links in the first place) protested, I just wanted to get some pro input from all of the SEO pros here at Moz.

    | Macaper
    1

  • Thanks Tim. The search part of the site is integrated from a third party, so it is hard to get them to do anything. But as you say, if it is not hurting, then all is OK.

    | ingageseo
    0

  • OK, I see it's a new update! I think everyone is that way.

    | Joseph-Green-SEO
    0

  • Thank you both, I appreciate it.

    | PeterRota
    0

  • Thanks Tom - very valuable info. A bit of background: we are running an e-commerce site, so I'm dealing with various forms of duplicate content: Descriptions that are copy-pasted from manufacturers, i.e. replicated elsewhere - we are addressing this by writing our own content. Duplicate content between pages because the products only change in colour or size - I am going to make these configurable products, i.e. group them into one page. URL params - this is a nightmare! Basically, faceted navigation is causing the same page to show very nearly identical content under different URLs. Anyway, we're slowly fixing these - it just seems like Google is painfully slow to respond to our work on improving things. Hence the question...

    | bjs2010
    0

  • If you are talking about the meta robots tag, Google shouldn't index any pages set to noindex, no matter where the link comes from.
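For reference, the standard form of that tag, placed in the page's `<head>`, looks like this:

```html
<meta name="robots" content="noindex">
```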

    | GeorgeAndrews
    0

  • Check out Quote Roller's SEO proposal template here: http://www.quoteroller.com/proposal-templates/seo-proposal-template/ And I suggest you give Quote Roller (www.quoteroller.com) a try - it has a 14-day free trial and is actually a really nice tool for sending and tracking business proposals.

    | eugenezaremba
    1

  • Hey, I think I answered the question. Go to this page: http://www.seomoz.org/article/search-ranking-factors Click on any of the tabs, scroll a little, select a sentence, and search for it on Google. You will see that Google gives the right URL, but it doesn't highlight the text on the SERPs. I do believe less weight is given in that case, since Google doesn't even seem to highlight the sentence. In the case of long-tail keywords, it will pick the destination that delivers the intent right away. In the case of hidden / non-visible divs, there are more steps required to get there.

    | sophia123
    1