Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: On-Page / Site Optimization

Explore on-page optimization and its role in a larger SEO strategy.


  • Thank you, I didn't think of the alt tag idea. I will try that. The awards at the bottom I put into the footer, not on each individual page. Thanks again, Rena

    | HonestSEOStudio
    0

  • Thanks James, you know your stuff!

    | xlucax
    0

  • When using the canonical tag, ensure you use the full URL. There are multiple protocols that can be addressed at a domain (FTP, for example), so including the http:// is best practice. A side note about canonicals: if you are planning on moving to HTTPS in the future, make sure you update your canonicals as well. Depending on how you currently deploy your website (WordPress, etc.), you could use a simple PHP script to store the protocol in a variable, which will change globally if the file is updated.

    | mcncl
    0
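
    For illustration, the absolute (full) canonical URL the reply above recommends looks like this (the domain and path are placeholders):

    ```html
    <!-- Good: absolute URL with the protocol included (example.com is a placeholder) -->
    <link rel="canonical" href="https://www.example.com/blue-widgets/" />

    <!-- Risky: a relative path leaves protocol and host ambiguous -->
    <link rel="canonical" href="/blue-widgets/" />
    ```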

  • Hi Donald, I'm using Drupal 7. I publish a lot of events for my geo-domain but I'm not really publishing "news" per se. I think I'll keep using CDN until I can somehow get AMP working with it and Drupal 7.  At times, I really wish I had gone down the WordPress path. Thanks for posting your experience.  I'm grateful. Steve

    | recoil
    0

  • Thomas, I agree with you about a copywriter's role and expertise. My point is that there ARE differences in the copy produced by a capable wordsmith versus a writer who understands and considers things like SERP features, semantic scope, mobile vs. desktop experience, the role of supporting assets, etc. I've spent so much time massaging professional copy that, by the time it was passably optimized, I had basically done it myself. So yes, I already pay 2x for optimized web copy (and code). The problem is that _half of that cost is my time_. I would definitely pay a premium for a copywriter with SEO chops. I digress...

    The question is whether decent web page / blog copy published via WYSIWYG is any more or less successful, SEO-wise, than the same copy coded by hand (by which I mean foundational SEO, not ninja guru jedi sh*t). I'm asking a specific technical question: WYSIWYG vs. hand coding. There is clear consensus here that coding by hand (done well) has a better chance to rank on the Google. That's pretty obvious, really. That is not the thrust of the question.

    Good copywriters write good copy. Good SEOs do good SEO. Copywriting is tough. We ask these professionals to become experts in topics (and their page-level details) in a matter of just a few (billable) hours. On the other hand, we SEOs spend weeks, months, and years with our clients. We understand their market, audience, vernacular, and differentiating nuance. I don't envy the copywriters' challenge, but I will pay a premium for a unicorn who can do it all. ...I digress again...

    This is a technical question: what is the delta for the same copy produced via WYSIWYG vs. by hand?

    | Jason-Rogers
    0

  • I have the simplest way to think about this: if the link points out to a website you trust and you think your audience will find it useful, keep it followed. However, if the link is not trustworthy, at least put a nofollow on it. Other than that, there are the Google guidelines you mentioned in your question. I think nofollowing social media and wiki pages is up to you, as it will not make much of a difference in my opinion.

    | MoosaHemani
    0
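
    For reference, the follow/nofollow split described above is applied per link via the rel attribute (URLs below are placeholders):

    ```html
    <!-- Followed: a trusted resource your audience will find useful -->
    <a href="https://trusted-example.com/guide">Useful guide</a>

    <!-- Nofollowed: a link you don't want to vouch for -->
    <a href="https://untrusted-example.com/" rel="nofollow">Some site</a>
    ```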

  • Hi there! I would run your review schema through the Structured Data Testing Tool another time to double-check that nothing has changed between when you implemented the schema and now: https://search.google.com/structured-data/testing-tool/u/0/

    Keep in mind that Google gets to choose whether or not it wants to show the review schema on the SERPs. So your best bet is having the schema on your website and knowing that it passes validation, but know that there is a chance it will not appear in the search results. Google chooses who it shows up for and when it shows up, which can be annoying, but as long as you have it implemented correctly, you're doing everything you can.

    | BlueCorona
    0
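
    As a point of reference, a minimal review-markup block that the testing tool above can validate looks roughly like this (the product name and rating values are made up):

    ```html
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Product",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.7",
        "reviewCount": "31"
      }
    }
    </script>
    ```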

  • I would imagine that removing the "description" element would follow the logic of markup, but it is likely immaterial whether you choose to remove it or not.

    | Brandon.
    0

  • Thank you! I was leaning towards a redo of the URLs. I appreciate the input!

    | maghanlinchpinsales
    0

  • I think I might have found the problem. We're actually using a 302 redirect from .com to .com/us/en/. From my understanding that doesn't transfer the ranking, right? We use a 302 redirect because we have multiple stores:

    https://website.com/us/en/ > United States
    https://website.com/xl/en/ > Global Store
    https://website.com.au > Australia

    If a user visits www.website.com from Australia, we send them to https://website.com.au. I'm thinking we should use a 301 redirect to /us/en/ and keep the 302 for all the other ones. Does that make sense?

    | ederdesign
    0
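
    A minimal sketch of the 301/302 split proposed above, written as nginx config (server names and paths come from the hypothetical URLs in the reply, and it assumes the geoip module is loaded):

    ```nginx
    server {
        server_name website.com www.website.com;

        location = / {
            # Temporary (302) for visitor-dependent geo redirects
            if ($geoip_country_code = AU) {
                return 302 https://website.com.au/;
            }
            # Permanent (301) to the default store, so ranking signals transfer
            return 301 https://website.com/us/en/;
        }
    }
    ```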

  • Yes, thank you for that. Makes sense. I just really have to think about the best way to approach this.

    | MissThumann
    0

  • Thanks for all of your responses. I understand that the title shouldn't look spammy and needs to be relevant to the user. My question is, for example, if I am writing the title for a women's scrub category page, should the title be "Medical Scrubs For Women: Nursing Uniforms At Low Prices | Name of Company", or should I take out the "at low prices" and put in more keywords, which may make it sound more like a list of keywords? My thought was that if I have the extra space, why not put in some text that will entice the user to click. Please let me know your thoughts.

    | whiteonlySEO
    0

  • I was aiming for organically, and you bring up a good point, thanks!

    | pgmorgana
    0

  • Hi Brainfruit, I'm assuming that the content of your menu would still be visible in the source code of the page itself. My best guess is that you're triggering it through JavaScript on click. None of this should affect how Google sees your page. What is probably happening is that the page itself hasn't been crawled by Moz yet, which makes the Page Authority 1. Martijn.

    | Martijn_Scheijbeler
    0

  • Hi, the top and bottom line is: how many are needed? I have clients who dominate huge search terms and have pages with many hundreds of links, and others who only have a small number, and the only difference between the two is what is actually needed. If you have so many that the page is confusing, then I would reduce them. If you do have many hundreds, why do you have them? Can they be reduced without affecting usability? There is no one answer to this, and each case must be looked at on its own merits. -Andy

    | Andy.Drinkwater
    0

  • Hi Stephen, definitely use the first example! The second could be seen as keyword stuffing in the eyes of Google, which would hurt your chances of ranking instead of helping them. The first way is also just much cleaner looking. You never want your URL to be too long.

    | BlueCorona
    0

  • Thank you, Patrick. 1 of 1 duplicate.

    Here's the other: http://thumannagency.com/list
    Here's the URL: http://thumannagency.com/locate

    I should mention we just updated our site, and Google may not have reindexed it totally yet. (If that's how it works - still learning...) Thank you :)

    | MissThumann
    0

  • Awesome, thank you so much for the detailed response and ideas - this all makes a good deal of sense and we really appreciate it!

    | paulz999
    0

  • Yes, I wrote the title wrong... I meant "related"... I don't get results whether I put "related:www.pccdkeys.com" or "related:pccdkeys.com". It doesn't need a full stop; it works without it... The problem is that I don't get any results. What I want is to find websites related to my website, not URLs related to pccdkeys.

    | dos0659
    0