Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • I really like Dana's response - it covers the primary consideration: how much time would it REALLY take to write unique meta descriptions? If the TRUE answer is "unrealistically too much time," then a template COULD work. The trick, though, is addressing the issues Dana talks about. If you only use a primary product name as the variable, you run risks. If you have a second database field that includes some differentiation between otherwise identical products, that can help, as long as you keep total length in mind.

    | AlanBleiweiss
    0
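A minimal sketch of the templating approach described above, assuming a hypothetical template wording, store name, and a roughly 155-character display limit (Google's real cutoff varies):

```python
MAX_LEN = 155  # rough SERP truncation point; the actual cutoff varies

def meta_description(product_name, differentiator=""):
    # Hypothetical template wording; swap in your own copy.
    if differentiator:
        desc = f"Shop {product_name} ({differentiator}) at Example Store."
    else:
        desc = f"Shop {product_name} at Example Store."
    # Keep the total length within the display limit.
    return desc if len(desc) <= MAX_LEN else desc[:MAX_LEN - 1].rstrip() + "…"
```

The second parameter is where the differentiating database field would feed in, so otherwise identical products do not end up with identical descriptions.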

  • Jason - I took a look at the Aplos Software site, and I tend to agree with you. What I like about the site is its simplicity, which is good for an organization promising to "Simplify Your Nonprofit's Finances." The color scheme is basic, and there's a big, strong call to action (Get Started for Free). That said, there are very few options for the end user to interact with the site, and links in the footer are not going to make that happen. To answer your three questions:

    1. What are your thoughts about including a drop-down menu in the header for the different products? My answer: I think this would be really helpful. If the end user is on a long, long page, a footer navigation is helpful to let someone quickly click to something else on the site, but it's not how most mainstream users navigate a site. The top navigation is...

    2. They have a good blog with content that gets regularly updated. Currently it's linked in the footer and gets a tiny amount of visits. What are your thoughts about including it as a link in the header instead? My answer: Yes! You should add the blog to the top of the site. The blog is a way for end users to see a little bit under the covers of an organization, since it's not content that's been through so many rounds of revisions that it's 100% polished. The blog is a great place to feature new clients, awards, and other information. If it's updated frequently, it also demonstrates that the company is active and conveys the idea that the software product the company is selling is not static or done, but is continuously supported. That may or may not be true, but a blog that is updated all the time will tend to convey that the company is still actively working on the product.

    3. What are best practices for using (or not using) nofollow with site navigation and footer links? How about with links to social media pages like Facebook/Twitter? My answer: Honestly, I wouldn't worry too much about using nofollow on your footer navigation, at least on this site. You don't have so many links that Google is going to ignore them, and they're at the end of the page anyway. You certainly could, though, and it won't hurt in any way. Just make sure that your sitemap, privacy policy, etc. are followed links. Hope this helps! Jeff

    | customerparadigm.com
    0

  • I think you have your answer then on how you want to focus your URLs and your site!

    | CleverPhD
    0

  • Oliver, Thank you for pointing out this "exception" to the Schema placement rule. As with any structured markup solution, there will, from time to time, be cases where certain specific elements go in the "head" section of the code. Anything that applies to an individual page in its entirety, and does not limit itself to an element of content within the page, does in fact belong in the "head" area of the page code.

    | AlanBleiweiss
    2
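As a concrete illustration of page-level markup that belongs in the head: an Organization block applies to the whole page, so its JSON-LD script tag can live there. The organization name and URL below are made-up placeholders:

```python
import json

# Hypothetical organization data; this block describes the page as a whole,
# which is why its <script> tag belongs in the <head>.
org = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Co",
    "url": "https://www.example.com/",
}
head_snippet = '<script type="application/ld+json">%s</script>' % json.dumps(org)
print(head_snippet)
```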

  • Hi Federico, I'm sorry but I don't think I agree that someone should nofollow links to themselves, particularly in the example given in the question by James. If you overdo it, you should just stop overdoing it rather than adding nofollow to the links. Google are fine with you cross linking between your subdomain and main domain as long as it makes sense for the user and isn't manipulative. Cheers. Paddy

    | Paddy_Moogan
    0

  • That's correct Keri! When determining whether something is a duplicate or not we will use the on-page information itself. Thanks! Joel.

    | JoelDay
    0

  • Thanks Takeshi and CleverPhD. This is exactly the kind of feedback I was looking for. Hiding the blank pages until there is content is actually pretty easy in this system, so I will just go ahead and do that. Thanks again.

    | rayvensoft
    0

  • Just be sure to put in 301 redirects from site B to site A, so that Google doesn't think the blog content is duplicate content, and you should be fine.

    | TakeshiYoung
    0
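The redirect mapping from site B to site A can be sketched as a simple lookup table; the domains and paths here are hypothetical, and in practice this logic would live in your server config rather than application code:

```python
# Minimal sketch of a 301 redirect map from old site B paths
# to the matching site A URLs (all values hypothetical).
REDIRECTS = {
    "/blog/first-post": "https://site-a.example.com/blog/first-post",
}

def handle(path):
    """Return (status, location): 301 for moved paths, 200 otherwise."""
    target = REDIRECTS.get(path)
    return (301, target) if target else (200, None)
```

The permanent 301 status is what tells Google the content has moved rather than been duplicated.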

  • You need to pull this forum question.  That link redirects to a spammy site about "Freeing Syria."

    | SearchDexSEO
    0

  • Google doesn't believe these links are natural because they represent no value for a website visitor. From what I know, the advice is to make them nofollow if you still want to keep them in the footer of the websites. Personally, I would go with another solution: I would place a dofollow link on a single page, for example the "About" page (if it contains details about the website as well, not only the company) or an "Impressum" page if the website has one.

    | SorinaDascalu
    0

  • Hi Ruben, Agree with Alex. Provided you have unique content for each page, you should be okay.

    | MiriamEllis
    0

  • Hi, I do not know why Yelp uses a 303, but what I can say is that a 303 does not pass link juice, and it is not recommended in place of a 301; a 301 is the best way to pass link equity to the destination URL. Here is a small experiment that proves 303 redirection worthless for this. Hope this helps, my friend. Best, Devanur Rafi

    | Devanur-Rafi
    0
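The distinction above comes down to HTTP semantics: only the permanent redirect codes tell a crawler the move is final. A small classification sketch, based on the status code definitions in the HTTP specs:

```python
# HTTP redirect semantics (per RFC 7231 and RFC 7238):
# permanent codes signal that the resource has moved for good,
# which is why 301 is preferred for passing link equity.
PERMANENT = {301, 308}
TEMPORARY = {302, 303, 307}  # 303 means "See Other", not "moved"

def is_permanent_redirect(status):
    return status in PERMANENT
```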

  • I doubt it would have a negative impact, but I would make sure that the alt attribute accurately describes the image. The alt attribute is just a small ranking factor, so I would not spend too much time on it. Just make sure that you're using the most appropriate image: if you're re-using the same image to target different types of keywords, you might be using images that depict something too broad. You be the judge: if you think it's a good image for the user, I can't imagine it getting you into trouble.

    | Carson-Ward
    0

  • My initial reaction was that this is more likely technical than something Google is doing - checking the load-time is a good idea. Make sure the sitemap validates and there's nothing odd about it. If you manually re-submit it, does it seem to take?

    | Dr-Pete
    0
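One quick way to check that a sitemap validates, as suggested above, is to parse it and confirm the URL entries come back cleanly. A minimal sketch using the standard library (the sample sitemap is a made-up example):

```python
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Parse a sitemap and return its <loc> values; raises on malformed XML."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(NS + "loc")]

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
</urlset>"""
```

If parsing raises, or the list of URLs looks wrong, that points to a technical problem with the sitemap rather than something Google is doing.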

  • Hi Sorina, Thanks for the response. That makes sense as the content isn't completely duplicate.

    | anneoaks
    0

  • Depending on the CMS you are using, you may be able to add a dynamic element to the URL when more than one filter is being used, for example noidx=true. Then utilize your robots.txt file to disallow all URLs containing noidx=true. This should allow pages with one filter to get crawled and indexed, but when a user (or Googlebot) enables another filter, that page would not be crawled. (Strictly speaking, robots.txt blocks crawling rather than indexing, but blocked filter pages will not have their content fetched and are very unlikely to rank.)

    | TopFloor
    0
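A sketch of the flag check described above. The parameter name noidx=true is the hypothetical one from the answer; the robots.txt rule uses the * wildcard extension that major crawlers such as Googlebot support:

```python
from urllib.parse import urlparse, parse_qs

# Matching robots.txt rule (hypothetical parameter name):
#   User-agent: *
#   Disallow: /*noidx=true

def has_noidx_flag(url):
    """True when the noidx=true parameter is present in the query string."""
    return parse_qs(urlparse(url).query).get("noidx") == ["true"]
```

The CMS would append the flag only once a second filter is applied, so single-filter URLs stay crawlable.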

  • No problems. Be interested to hear what the problem was if you're happy to share?

    | DougRoberts
    0

  • Thanks for your help Jesse - will just have to work harder

    | Joseph-Vodafone
    0

  • Hi Peter, The HTML sitemap is generated once a day, so most of the time it is up to date, even though it does generate some 404s when products are removed from the webshop earlier in the day. I will advise the webmaster to look for an XML sitemap generator for Magento. Thanks, Peter, for your attention and advice. Much appreciated, Auke

    | auke1810
    0
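Whatever generator the webmaster picks, the key point is that the sitemap should be rebuilt from the currently live URLs, so removed products never linger and 404. A minimal sketch of that idea using the standard library (URLs are hypothetical):

```python
import xml.etree.ElementTree as ET

def build_sitemap(live_urls):
    """Build sitemap XML from only the currently live URLs,
    so removed products never appear in it."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for u in live_urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
    return ET.tostring(urlset, encoding="unicode")
```

Regenerating more often than once a day (or on product delete) would close the window where stale URLs are served.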