Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: On-Page / Site Optimization

Explore on-page optimization and its role in a larger SEO strategy.


  • Thanks for everyone's responses! Matt-Antonino, I'll look to implement that solution. Appreciate it!

    | Gavo
    0

  • Hi Shalin,

    Good news: you can do both! That is, assuming tags would allow you to segment content in a meaningful way for users. If tags won't make things better for users, I'd just go with categories for the sake of simplicity. But if they are useful, I'd do the following: use categories as the primary method of organizing content, then leverage tags to provide further definition.

    Here's the catch: as others have correctly noted, tag pages have the potential to produce thin content, so I'd recommend applying a noindex meta tag to all tag pages, as well as excluding them in the robots.txt file (see the sketch below). If you're using one of the popular CMS platforms, like WordPress, this should be fairly easy to do.

    This method provides the best of both worlds. You give users more ways to filter down to the content they'd like to see, and it's SEO-friendly because the tag pages, which may produce thin, duplicative content, are excluded from the index and crawl and therefore should not present any SEO issues.
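    A minimal sketch of the meta-tag half of that setup, assuming a hand-edited tag-page template (most WordPress SEO plugins expose this as a per-taxonomy setting instead):

    ```html
    <head>
      <!-- Ask search engines not to index this tag page,
           while still following the links on it. -->
      <meta name="robots" content="noindex, follow">
      <title>Tag: example-tag</title>
    </head>
    ```

    And the corresponding robots.txt lines, assuming WordPress-style /tag/ URLs:

    ```
    User-agent: *
    Disallow: /tag/
    ```

    One caveat: a crawler blocked by robots.txt never fetches the page, so it can't see the noindex tag; many people therefore pick one mechanism or the other rather than combining them.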

    | trung.ngo
    0

  • Hi! I have to be honest and not sugar-coat this: I have never seen any direct SEO benefit from using site search. You can use the data you get from it, but it's probably not the tangible benefit you are looking for.

    | DennisSeymour
    0

  • (Insert animated GIF from The Sandlot here: "FOR-ev-ver.") Seriously, it can take a while for the pages to get removed. One week is not enough for the bot to recrawl all those pages; it will most likely take 1-2 months. I would resubmit the entire site using Fetch as Google, and resubmit all linking URLs. Go through your top-level menus and resubmit them as well.

    | David-Kley
    0

  • Thanks for the thoughtful reply, Samuel. Definitely some good questions, and a few I hadn't yet asked myself. I've made an effort to save press releases where there is definite long-tail value, and I also agree that point #2 about institutional knowledge is a big one. There are about 1,500 pieces of content in the audit, and maybe a fifth to a quarter of that is press releases (dating back as far as 2006), so I won't have time to check all of them for external links. But that's definitely something I hadn't thought about, so I might have to figure out how to work some of that into the timeline. Thanks again.

    | MilesMedia
    0

  • Hi,

    For starters, you could use the ‘Fetch as Google’ option in Webmaster Tools to see what your page looks like to search engines, or use a tool like browseo.net to do the same thing. Alternatively, make sure the page is indexable, link to it from somewhere, and then do a search for “one of your hidden text strings” (in quotes) to see whether that content has been crawled and indexed.

    If you can’t see your content, you may have a problem, and since crawlers can distinguish between hidden and non-hidden text, the issue may be more than the hidden content simply failing to help you rank: it might actually look like you’re trying to stuff keywords into your content without showing them to the user.

    I think the easiest and simplest fix would be to remove the class that makes these elements invisible and instead add that class dynamically with a little bit of jQuery, just for users with scripts enabled (see the sketch below). This way, when a crawler (or a user with JavaScript disabled) visits your site, it is served the page with the content visible; the content is only hidden when the visitor is accessing the site with JavaScript enabled.

    Hope this helps,
    Tom
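    A minimal sketch of that approach (the .read-more selector and the is-hidden class are hypothetical placeholders for whatever your markup and stylesheet actually use):

    ```html
    <!-- The content is present and visible in the raw HTML;
         no hiding class is set in the markup itself. -->
    <div class="read-more">Your previously hidden text...</div>

    <script src="https://code.jquery.com/jquery-3.7.1.min.js"></script>
    <script>
      // Runs only when JavaScript is enabled, so crawlers and no-JS
      // visitors always receive the page with the content visible.
      $(function () {
        $('.read-more').addClass('is-hidden');
      });
    </script>
    ```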

    | TomVolpe
    0

  • Short answer: yes. Google will only read a certain number of links on a page anyway, so if there are too many, some won't even be read. Too many internal links can also look like a link farm, and that can be penalised. I would likewise avoid keyword stuffing in your anchor text, as this will look spammy. There's a great article on internal link building here: http://www.quicksprout.com/2014/05/14/how-to-avoid-getting-slaughtered-by-penguin-3-0/ Internal links are a great way of helping your users navigate around the site, but use them to help people navigate, not purely for SEO. If you do it naturally and forget about the search engines, you will be fine; it's when you try to cheat them that things look spammy and you have the potential to be penalised. When adding a link, ask yourself: does the user need this link, and will it help them on their journey?

    | Andy-Halliday
    0

  • Thanks so much for the responses. Agree that usability is most important. It's something we have always stuck to. Just want to make the most of internal links. But this is very helpful. Appreciated.

    | HireSpace
    0

  • Thanks for all the responses, extremely useful.

    | HireSpace
    0

  • Hi everyone, I just wanted to follow up on this thread to see if I could provide any further assistance. I would be happy to look into your site if you could provide the URL that is giving you trouble. :) If you want to work 1:1 with a Help Teamster, I would recommend sending an email about this to help@moz.com. In the meantime, if you have any other questions or concerns, please feel free to ask. Have a great day!

    | Sean_Peerenboom
    0

  • Xenu Link Sleuth, or even just a crawl test by Moz at https://moz.com/researchtools/crawl-test should help identify issues.

    | KeriMorgret
    0

  • Hi, are you talking about the local SERPs? It might have been the algorithm update that recently rolled out. I have noticed some changes as well.

    | benjaminmarcinc
    0

  • I would provide an example of one of the underperforming pages, and one of your pages that does well so we can analyze the differences and potential issues.

    | David-Kley
    0

  • Thank you Cibble, I will look into getting those links.

    | bradgmo
    0

  • This is actually really easy to test. Set up a basic version of each and run the URL through Seo-Browser, which will show you how your website is seen by a search engine bot. I have used it for TONS of sites and never had it fail me when I needed to see whether something had to be changed. Once you paste your URL in, click the "simple" button. You can also sign up (it's free) to get more in-depth results. As long as your slideshow contains live text (meaning readable text, not image-based) that is crawlable, you should be fine (see the sketch below). Try it, and test using the method above. Best of luck!
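    For reference, "live text" just means the words exist in the HTML itself rather than being baked into the image pixels. A minimal sketch (file paths and class names are hypothetical):

    ```html
    <!-- Crawlable: the headline is real text in the markup. -->
    <div class="slide">
      <img src="/images/slide-1-bg.jpg" alt="Product screenshot">
      <h2>Spring Sale: 20% Off All Plans</h2>
    </div>

    <!-- Weak: the headline exists only as image pixels;
         at best the crawler sees the alt text. -->
    <div class="slide">
      <img src="/images/slide-1-with-text.jpg" alt="Spring Sale: 20% Off All Plans">
    </div>
    ```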

    | David-Kley
    0

  • Google reps have said that the data from GA isn't used for ranking, and have specifically called out bounce rate as a very noisy metric. There are several good references in this Search Engine Roundtable post at http://www.seroundtable.com/google-bounce-rate-attacks-16203.html.

    | KeriMorgret
    0

  • Okay. Thanks for the help. Will do that. Not for all pages (that's why we have a feed; around 100K products), but at least for the items that matter to us...

    | Raymo
    0