Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: On-Page / Site Optimization

Explore on-page optimization and its role in a larger SEO strategy.


  • Hi there, sorry for the long-delayed reply! Did you see the questions Dmitri asked? To help answer your question, we need that information as well. Thanks!

    | Christy-Correll
    0

  • The meta keywords tag isn't used by modern search engines (more info). It's only used by competitors to see what keywords you are targeting. As Kay mentioned, just keep it natural; if you feel it's light, then maybe you need more relevant content.

    | Axios_Systems
    0

  • Hi Neil, Open Graph tags are different from normal meta tags. It is okay to use both. Open Graph tags are specific to social media, while meta tags are used by search engines. In some cases you may want to use the same information in both, while in others it would make sense to tailor your information to the specific platform. Here are a couple of articles to help: on Moz: https://moz.com/blog/meta-data-templates-123 and on iAcquire: http://www.iacquire.com/blog/18-meta-tags-every-webpage-should-have-in-2013 Hope this helps, Don

    | donford
    0
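To illustrate Don's point, here is a minimal sketch of a page head that carries both kinds of tags side by side (the titles, descriptions, and URLs are placeholders, not from any poster's site):

```html
<head>
  <!-- Standard meta tags: read by search engines -->
  <title>Example Product Page</title>
  <meta name="description" content="A short summary shown in search results.">

  <!-- Open Graph tags: read by social platforms when the page is shared -->
  <meta property="og:title" content="Example Product Page">
  <meta property="og:description" content="A summary tailored for social sharing.">
  <meta property="og:image" content="https://www.example.com/images/product.jpg">
  <meta property="og:url" content="https://www.example.com/product">
</head>
```

As Don says, the values can match or diverge depending on whether you want search snippets and social share cards to read differently.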

  • I'm glad to hear you've had good results. I'm about to give it a try myself, since it seems to be among the better options out there. I've used one called 'Better Internal Links,' I think, and it worked well at the time, so I'm hoping this one is even better. Despite what the other poster says about internal linking plugins (he doesn't seem to have used this one or any other, to offer a better approach than adding each link manually, or to really know how they work), internal linking is smiled upon by Google, as you know, and if you already have a ton of posts that need links, adding them manually may well be impractical. The plugin I previously used was very good about linking related content, and your page views should go up almost immediately.

    | musgrove
    0

  • TL;DR - Yes and no. There are two cases, and I will be as descriptive as possible. The first case is when subfolders are linked from other pages, like /subfolder2/page2 -> /subfolder1/ or /subfolder1/page1 -> /subfolder2/. In this case you link directly to a 404 page, and this can frustrate users and bots alike. There is also some confirmation that an unhelpful 404 page is a low-quality signal: http://themoralconcept.net/pandalist.html (#10 in low-quality signals). That's why you need to use crawlers and fix 404s - good for users and good for bots too. The second case is when users are curious. I'm one of them, sometimes. So let's say we have these URLs: http://www.moz-team.com/randfishkin/article1 http://www.moz-team.com/randfishkin/article2 http://www.moz-team.com/randfishkin/article3 http://www.moz-team.com/cyrussheppard/article1 http://www.moz-team.com/cyrussheppard/article2 http://www.moz-team.com/cyrussheppard/article3 As you can see, the URLs are very clean and very descriptive. Now add a curious user (pick me!) who wants to see more about Rand or Cyrus - a page with CVs, a short bio, or a list of all their articles. They simply edit the URL down to http://www.moz-team.com/randfishkin/ or http://www.moz-team.com/cyrussheppard/ using backspace. In a perfect world this would give them that information... but in your case, a 404. And this is not good for users. That's why it's much better if you can create a "category" page for each subfolder, even if it isn't linked from other pages. This has been explained many times as a "silo structure": https://moz.com/blog/site-architecture-for-seo http://www.bruceclay.com/eu/seo/silo.htm http://www.stateofdigital.com/optimising-urls-seo-ux/ I hope this answer helps. You MUST optimize your site for users and bots alike.

    | Mobilio
    0

  • Depending on how you trigger the scroll, Google might render the page scrolled or unscrolled. Usually, if it's done in the onload() handler via JavaScript, Google will execute that script and render the page as it appears after the script runs. I've seen examples, though, where a script in jQuery's document-ready function is NOT executed by Google when rendering the page. Test in Google Search Console, using Fetch and Render as Googlebot.

    | MichaelC-15022
    0
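A minimal sketch of the two triggering approaches Michael contrasts (the scroll offset and jQuery path are placeholders; whether Googlebot executes each one should be verified with Fetch and Render, as he suggests):

```html
<!-- Scroll triggered in onload: in Michael's experience, Googlebot
     usually executes this and renders the scrolled state. -->
<script>
  window.onload = function () {
    window.scrollTo(0, 600);
  };
</script>

<!-- Scroll triggered in jQuery's document-ready handler: he has seen
     cases where Googlebot did NOT execute this when rendering. -->
<script src="/js/jquery.min.js"></script>
<script>
  $(document).ready(function () {
    window.scrollTo(0, 600);
  });
</script>
```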

  • Thanks guys, and yep, this is the template I am looking at for my own site - just trying to ensure it can do everything I need SEO-wise. I was told that because the content is dynamic rather than static, it wouldn't index as well?

    | neilhenderson
    0

  • Well, the rest of their site, which requires a login, does. I doubt that a company like MailChimp is going to get any boost by switching to HTTPS for just their homepage. To state it once again, HTTPS is just another very small ranking factor on top of the hundreds of others they already have. It's just been in the news more than the others, so in the case of MailChimp, which has long been an authority in e-mail marketing, I doubt they'll see any big changes based on whether or not they support HTTPS.

    | Martijn_Scheijbeler
    1

  • Not surprising JonOS. If you've done a good enough job with the on-page, that's usually what's remaining.

    | DonnaDuncan
    0

  • Thought I'd provide a bit of an update on this one for you all. I added the GTIN to two different products: one GTIN was added as a product attribute (text and number) and the other was added with schema markup (gtin14 - https://schema.org/gtin14). I've seen no ranking boost for either product on the keywords I'm tracking for them; however, I have seen a small increase in traffic to the product which uses the schema markup. From what I can tell from my analytics, it would appear that some users actually search Google using the GTIN! It seems that gtin14 isn't widely used at present, and as such I'm ranking in the top spot. So I'm thinking of adding the GTIN to all our products as a bit of a quick win to rank top for a small percentage of searches. I suppose it all depends on what products you're selling and your user demographic as to whether your potential customers would ever search using a GTIN. My personal view, after some more reading (GS1 SmartSearch, formerly GTIN on the Web - http://www.gs1.org/gs1-smartsearch), is that the GTIN is going to become more prominent in the not-too-distant future, but hey, I'm no expert. I'd love to know if anyone else has tested this out, or if they try it and get the same results as I have.

    | Jon-S
    0
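For anyone wanting to try the schema-markup variant Jon describes, here is a minimal sketch of a product with a gtin14 property in JSON-LD (the product name, price, and GTIN digits are placeholders, not Jon's actual data):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "gtin14": "00012345600012",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "GBP"
  }
}
</script>
```

A real gtin14 value must be the product's actual 14-digit code, zero-padded if it comes from a shorter GTIN-8/12/13.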

  • That's correct. In your HTML, the link for the search icon uses href="javascript(0);", and as written it fools some bots into thinking there is a file named javascript(0) in the current folder. So if I'm here: http://www.jasonfox.me/infographics/page/9/ then the bot treats it as a relative path and the full path becomes http://www.jasonfox.me/infographics/page/9/javascript(0), and this is how the 404 is made. The correct fix is to replace "javascript(0);" with "javascript:void(0)" or with "#". This one patch in the WordPress theme (look around header.php) will stop the 404s.

    | Mobilio
    0
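In the theme's header.php, the change Mobilio describes would look roughly like this (the class name and link text are hypothetical; only the href values matter):

```html
<!-- Before: some bots resolve "javascript(0);" as a relative URL,
     producing 404s like /infographics/page/9/javascript(0) -->
<a href="javascript(0);" class="search-icon">Search</a>

<!-- After: either of these stops the bogus crawl paths -->
<a href="javascript:void(0);" class="search-icon">Search</a>
<a href="#" class="search-icon">Search</a>
```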

  • Thanks, great answer - will update the template I think

    | wearehappymedia
    1

  • Hello Kory, How many products per page are you showing visitors? 12? In my experience, shoppers prefer to see more products on fewer pages. The best user experience would be to see them all on one page, with more products progressively loading as the user scrolls down. This would be a "View All" canonical situation, which Google suggests for fast-loading websites. Since Magento isn't particularly fast, and because you have a very large catalog, I wouldn't advise a "View All" canonical in your situation. However, you could certainly load more than 12 products per page. You could double that and cut in half the number of paginated pages Googlebot has to load in order to find all of the products in each category. Another thing I recommend is putting static category introduction content on the first page but not on subsequent pages, and customizing the Title tag on all pages by adding in the page number. These two things cut down the duplicate content risks. And of course, keep rel=next/prev and your noindex,follow tags on paginated pages. All set?

    | Everett
    1
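A sketch of the head markup Everett recommends, as it might look on page 2 of a paginated category (the store name and URLs are placeholders):

```html
<!-- Page number appended to the title to differentiate paginated pages -->
<title>Widgets - Page 2 | Example Store</title>

<!-- rel prev/next declaring the page's place in the pagination sequence -->
<link rel="prev" href="https://www.example.com/widgets?p=1">
<link rel="next" href="https://www.example.com/widgets?p=3">

<!-- Keep paginated pages out of the index but let link equity flow -->
<meta name="robots" content="noindex,follow">
```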

  • Honestly, I'd be surprised if you were penalised for this. At worst, you may be a little hindered algorithmically, but even then, if you've got sufficient quality content on your site, a solid structure, good page titles, etc., then this should just be a minor detail at the end of the day. Talking purely about rankings, SEO these days is really just about the overall signals and patterns you send; just make sure this isn't one of many risky signals. Of course, if you can fix it (maybe with the help of a dev), then it would be wise. If it were my site, my biggest concern would be the impression this gives the user. Does it feel like keywords are being blatantly shoved in their face? If so, the issue becomes a little more important. Search engines don't buy your products, users do!

    | ChrisAshton
    0

  • Nope, unfortunately not. Donford's answer is the only way you can remove those sitelinks without hurting your SEO. Google wants to provide users with the best possible experience and sometimes this means providing them with links to the strongest/most used parts of your site directly from the SERPs. You can at least take it as a positive that they're showing up!

    | ChrisAshton
    0

  • Hi Jochen, SUPER interesting find, thanks for pointing this out. To me, this looks like Google understands that these two pages are the same page, just for different devices, and is using information on the desktop page to make its search results more robust for mobile. You can see the connection by looking for Google's cache of your mobile page. The best way to do this is to search in Google for "cache:[URL]". If you search for "cache:http://m.avogel.ch/de/ihre-gesundheit/erkaeltung/alles_ueber_erkaeltungen.php", Google will send you to the desktop version of the page. Here's my theory: Google has one index for both desktop and smartphone users, so it combines data and gives the user the best result possible. Google is doing more and more to improve its search results even without SEO intervention, so I'm not too surprised about this, but I can't seem to find it covered in any SEO articles out there. In answer to your question: I recommend that you continue to keep your mobile and desktop sites similar enough that Google is pulling from both. In the past, some SEOs would build sites differently for mobile users, but I've never seen any UX studies showing that that's a better approach. Given that Google strongly recommends that you use responsive web design, it's certainly not Google's recommended approach. I hope this helps! I'm not sure if you posted because you were worried about something - this seems like good news to me! Kristina

    | KristinaKledzik
    1

  • The dating of content has gotten a lot of play of late, in large part because of a post that appeared on the Moz blog: https://moz.com/blog/case-study-can-you-fake-blog-post-freshness Despite the dissenting opinions on all sides, this much appears to find consensus: update your content and the content's date when you have new, valuable information to provide, and only then. Otherwise the results are likely to be short-term and not very worthwhile. RS

    | ronell-smith
    0

  • Did you try Yoast SEO's sitemap or All in One? Yoast works fine for me. Never had any issues.

    | Verve-Innovation
    0

  • I'll add this article by Rand that I came across too. I'm busy testing the solution presented in it: https://moz.com/blog/are-404-pages-always-bad-for-seo In summary: serve a good custom 404 page for all dead pages so as to not waste crawl bandwidth, then selectively 301 those dead pages that have accrued some good link value. Thanks Donna/Tammy for pointing me in this direction.

    | dsumter
    0