Category: On-Page / Site Optimization
Explore on-page optimization and its role in a larger SEO strategy.
-
Is there a tool that I can use to scrape and see metatags?
Thanks for everyone's responses! Matt-Antonino, I'll look to implement that solution. Appreciate it!
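If you'd rather script the check than use a tool, here's a minimal sketch in Python using only the standard library's `html.parser` — the class and function names are my own illustration, not any particular tool mentioned in this thread:

```python
from html.parser import HTMLParser

class MetaTagScraper(HTMLParser):
    """Collects the <title> text and all <meta> name/property -> content pairs."""
    def __init__(self):
        super().__init__()
        self.meta = {}
        self.title = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta":
            # Meta tags are keyed by either name= (description, robots, ...)
            # or property= (Open Graph tags like og:type).
            key = attrs.get("name") or attrs.get("property")
            if key:
                self.meta[key] = attrs.get("content", "")
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def scrape_meta(html):
    """Return (title, {meta name/property: content}) for an HTML document."""
    parser = MetaTagScraper()
    parser.feed(html)
    return parser.title.strip(), parser.meta
```

To inspect a live page you'd pair this with something like `urllib.request.urlopen(url).read().decode()` to fetch the HTML first.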
| Gavo0 -
Tags vs. Categories? What should I use?
Hi Shalin, Good news: you can do both, assuming tags would let you segment content in a meaningful way for users. If tags won't make things better for users, I'd just go with categories for the sake of simplicity. But if they are useful, I'd do the following: use categories as the primary method of organizing content, then leverage tags to provide further definition.

Here's the catch: as others have correctly noted, tag pages have the potential to produce thin content, so I'd recommend applying a noindex meta tag to all tag pages. (One caution: don't also block those pages in robots.txt — if crawlers can't fetch them, they'll never see the noindex directive.) If you're using one of the popular CMS platforms, like WordPress, this should be fairly easy to do.

This method provides the best of both worlds: you give users more ways to filter down to the content they'd like to see, and it's SEO-friendly, because the tag pages — which may produce thin, duplicative content — are kept out of the index and therefore shouldn't present any SEO issues.
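If you want to spot-check that your tag pages actually carry the noindex directive, a rough Python sketch follows. It's regex-based, so treat it as a quick audit helper rather than a robust HTML parser, and the function name is illustrative (it also won't catch the equivalent `content="none"` directive or an `X-Robots-Tag` HTTP header):

```python
import re

def has_noindex(html: str) -> bool:
    """True if the page has a <meta name="robots"> tag whose content
    includes a noindex directive."""
    for m in re.finditer(r'<meta\s+[^>]*name=["\']robots["\'][^>]*>', html, re.I):
        content = re.search(r'content=["\']([^"\']*)["\']', m.group(0), re.I)
        if content and "noindex" in content.group(1).lower():
            return True
    return False
```

Run it over the fetched HTML of a handful of tag-page URLs after changing your CMS settings to confirm the tag is really being emitted.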
| trung.ngo0 -
Google Site Search & SEO benefits
Hi! I have to be honest and not sugarcoat this: I have never seen any direct SEO benefit from using site search. You can use the data you get from it, but it's probably not the tangible benefit you're looking for.
| DennisSeymour0 -
Duplicate meta and title in Google Webmaster Tools not updated?
:insert animated gif from sandlot movie FOR-evv-err: Seriously, it can take a while for the pages to get removed. One week is not enough time for the bot to recrawl all those pages; it will most likely take 1-2 months. I would resubmit the entire site using Fetch as Google, and resubmit all linking URLs. Go through your top-level menus and resubmit them as well.
| David-Kley0 -
SEO value of old press releases (as content)?
Thanks for the thoughtful reply, Samuel. Definitely some good questions, and a few I hadn't already asked myself. I've made an effort to save press releases where there is definite long tail value. I also agree that point #2 about institutional knowledge is a big one. There are about 1,500 pieces of content in the audit and maybe 1/5-1/4 of that is press releases (dating back as far as 2006), so I won't have time to check all of them for external links, but that's definitely something I hadn't thought about, so I might have to figure out how to work some of that into the timeline. Thanks again.
| MilesMedia0 -
Fading in content above the fold on window load
Hi, For starters you could use the 'Fetch as Google' option in Webmaster Tools to see what your page looks like to search engines, or use a tool like browseo.net to do the same thing. Or you could make sure the page is indexable, link to it from somewhere, and do a search for "one of your hidden text strings" (in quotes) to see if that content has been crawled and indexed.

If you can't see your content then you may have a problem, and since crawlers can distinguish between hidden and non-hidden text, it may be worse than the content simply failing to help you rank: it might actually look like you're trying to stuff keywords into your content without showing them to the user.

I think the easiest and simplest fix would be to remove the class which makes these elements invisible, and dynamically add that class with a little bit of jQuery only for users with scripts enabled. This way, when a crawler (or a user with JavaScript disabled) visits your site, it is served the page with the content visible; the content is only hidden when the visitor is accessing the site with JavaScript enabled. Hope this helps, Tom
| TomVolpe0 -
Internal linking
Short answer: yes. Google will only read a certain number of links on a page anyway, so beyond that point extra links won't even be read. Too many internal links looks like a link farm, and that can be penalised. I would also avoid keyword stuffing in your anchor text, as this will look spammy. There's a great article on internal link building here: http://www.quicksprout.com/2014/05/14/how-to-avoid-getting-slaughtered-by-penguin-3-0/ Internal links are a great way of helping your users navigate around the site, but use them to help people navigate, not purely for SEO. If you do it naturally and forget about the search engines you will be fine; it's when you try to cheat them that things look spammy and you risk being penalised. When adding a link, ask yourself: does the user need this link, and will it help them on their journey?
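To get a feel for how many internal links a page carries and whether the same anchor text is being repeated, here's a minimal Python sketch using only the standard library — the names are illustrative, not from any tool mentioned above:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class AnchorCollector(HTMLParser):
    """Collects (href, anchor text) pairs from <a> tags."""
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.anchors.append((self._href, "".join(self._text).strip()))
            self._href = None

def internal_link_report(html, site):
    """Count internal links on a page and how often each anchor text recurs.

    A link is 'internal' if, resolved against the site URL, it shares the
    site's hostname. Repeated identical anchor text is worth a second look
    for the keyword-stuffing concern mentioned above."""
    p = AnchorCollector()
    p.feed(html)
    host = urlparse(site).netloc
    internal = [(urljoin(site, h), t) for h, t in p.anchors
                if urlparse(urljoin(site, h)).netloc == host]
    counts = {}
    for _, text in internal:
        counts[text] = counts.get(text, 0) + 1
    return len(internal), counts
```

Feed it the page's HTML and your site root; a very high link count, or one anchor phrase dominating the counts, flags a page worth reviewing by hand.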
| Andy-Halliday0 -
Alt text / internal linking
Thanks so much for the responses. Agree that usability is most important. It's something we have always stuck to. Just want to make the most of internal links. But this is very helpful. Appreciated.
| HireSpace0 -
Designing path structure - readability or keyword density
Thanks for all responses, extremely useful.
| HireSpace0 -
What on-site issue could be causing Moz to not detect internal links?
Hi Everyone, I just wanted to follow up on this thread to see if I could provide any further assistance. I would be happy to look into your site if you could provide the URL that is giving you trouble. :) If you want to work 1:1 with a member of our help team, I would recommend sending an email about this to help@moz.com. In the meantime, if you have any other questions or concerns, please feel free to ask. Have a great day!
| Sean_Peerenboom0 -
Best practice for URL structure - short and sweet, or double keyword?
Thanks Andy, great help!
| HireSpace0 -
Site not showing up in Google search since move
Xenu Link Sleuth, or even just a crawl test by Moz at https://moz.com/researchtools/crawl-test should help identify issues.
| KeriMorgret0 -
Why do some Keywords collapse in SERPS?
Hi, Are you talking about the local SERPs? It might have been the algorithm change that rolled out recently; I have noticed some updates as well.
| benjaminmarcinc0 -
Why will a Page not rank or improve on a website?
I would provide an example of one of the underperforming pages and one of your pages that does well, so we can analyze the differences and potential issues.
| David-Kley0 -
Google directs people to the wrong page on my site.
Thank you Cibble, I will look into getting those links.
| bradgmo0 -
Is the HTML content inside an image slideshow of a website crawled by Google?
This is actually really easy to test. Set up a basic version of each, and run the URL through SEO-Browser, which will allow you to see how your website is seen by a search engine bot. I have used this for TONS of sites, and it has never failed me when I needed to see whether something had to be changed. Once you paste your URL in, click the "simple" button. You can also sign up (it's free) to get more in-depth results. As long as your slideshow contains live text that is crawlable (meaning readable text, not text baked into images), you should be fine. Try it, and test using the method above. Best of luck!
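The idea behind a text-mode view like SEO-Browser's "simple" mode can be approximated in a few lines of Python: strip out script and style blocks and see what readable text remains. This is a rough sketch, not a faithful reproduction of that tool:

```python
from html.parser import HTMLParser

class VisibleTextExtractor(HTMLParser):
    """Collects text content while skipping <script> and <style> blocks --
    a rough approximation of what a text-only bot extracts from a page."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self._skip_depth = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.chunks.append(data.strip())

def crawlable_text(html):
    """Return the page's text content as a single string."""
    p = VisibleTextExtractor()
    p.feed(html)
    return " ".join(p.chunks)
```

If your slideshow's headings and captions show up in this output, they're live text in the markup; if a slide's message only exists inside an image file, it won't appear at all.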
| David-Kley0 -
How does user behaviour signalled at Google affect rankings?
Google reps have said that the data from GA isn't used for ranking, and have specifically called out bounce rate as a very noisy metric. There are several good references in this Search Engine Roundtable post at http://www.seroundtable.com/google-bounce-rate-attacks-16203.html.
| KeriMorgret0 -
Duplicate content, which seems not to be duplicate :S
Okay. Thanks for the help. Will do that. Not for all pages (that's why we have a feed; around 100K products), but at least for the items that matter to us...
| Raymo0