Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • It depends on the customer experience you want and how many products you have to show. There are pros and cons to both options: a "load more" button keeps the user on the same page for longer, letting them scroll through the options easily without jumping between pages, which should increase engagement by letting people see everything in one go. BUT, unless you fetch the extra items on demand via AJAX or similar, 'hiding' them from the user until they click "see more" can slow page speed, as you're obviously storing all of that data on the page ready to be loaded. Pagination forces people to flick through different pages and would need rel=next/prev set up and canonicalised back to page 1, otherwise you may be seen as having duplicate content for meta titles etc., unless you can change these dynamically or set the deeper pages to noindex. It can also put some people off, as it can make someone feel like they have to sift through a LOT of pages to find what they're looking for. There's a useful study on this here: https://www.smashingmagazine.com/2016/03/pagination-infinite-scrolling-load-more-buttons/
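    As a rough sketch of the pagination markup described above (the domain and URL pattern are placeholders, not from the original answer), page 2 of a paginated category might carry:

    ```html
    <!-- In the <head> of page 2 of a paginated series (example.com is a placeholder) -->
    <link rel="prev" href="https://www.example.com/category?page=1">
    <link rel="next" href="https://www.example.com/category?page=3">
    <!-- Only if you take the canonicalise-back-to-page-1 approach the answer mentions -->
    <link rel="canonical" href="https://www.example.com/category">
    ```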

    | Kelly_Edwards
    1

  • Hey Alex - building on David's note. Without more information (like your site), it's hard to see exactly what's going on, but a few things come to mind: Are you using hreflang tags? If you're not, the 302 redirects may be a workaround that a content editor or dev put in place to get users to the appropriate location on your site. In addition to resolving these redirects, I highly recommend you implement hreflang tags in the <head> of your site's pages so that crawlers know that these different international TLDs are all related. You should definitely remove these redirects, especially if there are versions of these URLs that exist with 200 status codes. There may be a rule in your site's .htaccess file (you'll need to talk to your developer and/or server manager to update this), or within meta directives in the <head> tags of your pages.
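    For illustration, hreflang annotations like the ones suggested above might look like this (the domains and path are placeholders; each version of the page should list all of its alternates, including itself):

    ```html
    <!-- In the <head> of every language/country version of the page -->
    <link rel="alternate" hreflang="en-us" href="https://www.example.com/page">
    <link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/page">
    <link rel="alternate" hreflang="de-de" href="https://www.example.de/page">
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/page">
    ```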

    | zeehj
    0

  • Most likely after the next Mozscape Index update. If the 301 is done correctly and PA is not reflected in the short term, I wouldn't worry that much. Make sure to monitor organic positions and traffic to those landing pages for any fluctuations.

    | KevinBudzynski
    0

  • Hi Zack, Have you configured your parameters in Search Console? It looks like you've got your prev/next tags nailed down, so there's not much else you need to do. It's evident to search engines that these types of dupes are not spammy in nature, so you're not running the risk of getting dinged.

    | LoganRay
    0

  • Thanks, this is great. I had seen this in my quest for information, but I would still like examples of sites where JSON-LD has been implemented well. I am trying to get things right first time, with all the bells and whistles and maybe a trumpet as well. It strikes me that you can be pretty clever in the way this is implemented, and articles do not really give the same insight as well-crafted in-page code.
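    As a minimal sketch of what an in-page JSON-LD block looks like (the headline, author, and date here are invented placeholders, not a real example site):

    ```html
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Example Article Title",
      "author": { "@type": "Person", "name": "Jane Doe" },
      "datePublished": "2017-01-15"
    }
    </script>
    ```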

    | Eff-Commerce
    0

  • Hi hoosteeno, It's obviously tricky to diagnose specific technical issues in the abstract, without looking at the specific site, but here are some resources and ideas that might help. Rand recently talked about ranking fluctuations on Whiteboard Friday - that'll be worth checking out. Major site redesigns can cause quite a bit of fluctuation, but you don't want to wait around to find out weeks or months later that there is a technical issue, so I would suggest working through this checklist. It's also worth checking whether you have accidentally caused a load of new on-site duplicate content with the recent work - that's one other thing that could cause weird in/out fluctuations. Hope something there helps. Good luck!

    | willcritchlow
    0

  • Hey there, If it's done correctly, you shouldn't see any ranking drops for a longer period of time. Of course, there's always a risk when you're playing with redirects, but I'd follow the Google guidelines (as you correctly mentioned) and it should work smoothly. Let me know how it worked. Cheers, Martin

    | benesmartin
    0

  • Hello, As Martin suggested, it seems any option will require time and effort. Are you sure these pages are causing an issue with duplicate content? Are they ranking and pulling in traffic/sales? I'm dubious that these pages are having a negative impact. I would want to verify that as much as possible before expending the resources to canonicalise or take other action. That said, a canonical would be the recommended solution here - and should take no more effort than the "noindex" tag. Best, Mike
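    For comparison, the two options weighed above each come down to a single tag in the duplicate page's <head> (example.com is a placeholder):

    ```html
    <!-- Option 1: canonical - consolidate signals to the preferred URL -->
    <link rel="canonical" href="https://www.example.com/preferred-page">

    <!-- Option 2: noindex - keep the page out of the index entirely -->
    <meta name="robots" content="noindex">
    ```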

    | MikeTek
    0

  • Dear Devanur, Thank you kindly for taking the time to respond to me here with such a thorough and logical reply. I was naturally swayed over to your way of thinking, and your answer has helped cement my thoughts. The idea of running multiple sites all independently trying to climb the SEO ladder, versus one big site with all pages working in unison, looks to be a chicken-and-egg conundrum at face value. The trip wire was the hygiene of the URL, and the clarity for users who might happen to notice a generic URL which isn't immediately or easily relatable to the brand vertical. With this comes the need to stamp authority on that generic name, which in fact is the same issue for the vertical brand, which would have little to no brand recognition anyway if both were starting from scratch. So your answer here really is: build a sustainable generic universal domain name, infuse it with authority for the end user, and reap the rewards of SEO, PageRank, and backlinks, all drawing back to verticals that are optimised for their niche at the same time. This makes perfect sense, and is a smart answer. So again, your thought process was most helpful here, and has certainly helped me to feel greater confidence in this strategy. Thank you for your help again, and for taking the time to give me such great advice, which is very greatly appreciated. Warm Regards, Chris

    | RaynChris
    0

  • I try to stay relevant to the tag of the article, or use the title of the article. It is not about stuffing keywords in places; it is about relevancy. Here is a cornerstone article: https://www.brightvessel.com/21-tips-woocommerce-website-design And here is one of my supporting articles: https://www.brightvessel.com/multiplying-e-commerce-site-conversions/ At the bottom of that post I added a link: "If you like this post, check out 21 Tips for WooCommerce Website Design." For me, I like things to be understandable and relevant to each article I link to, which works. I would use the mindset that you're providing information, and how you link that information needs to reflect that.

    | brightvessel
    0

  • Hi there, Thanks for the question. Can I just check something regarding this bit from your question: "My company currently owns five different websites and every day we download a list of Google crawl errors. I then crawl the downloaded list with Screaming Frog to double check the redirects to make sure the pages are not 404s." I'm a bit confused, because you say you download a list of crawl errors, which would be 404s, but you then crawl them to make sure they're not 404s? Could you clarify this bit? Thanks! Paddy

    | Paddy_Moogan
    0

  • Hi! Potentially you can, but it would probably be better to implement it in your current sitemap structure, as that will give you a better workflow. Martijn.

    | Martijn_Scheijbeler
    0

  • "There are always a lot of different opinions." Yes, indeed. There will always be lots of opinions. If you hire a person to review what you do, then you will get ONE opinion. In my opinion, a lot of the opinions posted online, either in articles or in forums, are flawed (including a lot of what I post). Nobody knows everything, and a lot of people sell SEO services who should not be selling SEO services. Be careful.

    My method is to depend mainly on myself, and then, when I am going to do something really important, I get a paid opinion. I don't use forums as a substitute for opinions on things that are more than a simple one-topic question/answer. People who post in forums generally are not going to spend hours doing a deep review of your site and making complex decisions. You will not get good advice on a topic that requires a deep and detailed knowledge of your website and your business. People posting in a forum generally don't have that kind of extra time, and we should not expect that type of service in a forum. I also believe that this type of information should remain private and not be discussed in the open in a forum.

    How to find someone? My suggestion is to look at a Q&A forum like Moz daily and read lots of the questions. After doing that for several months you will learn a lot. You will also see the range of people who give answers. Some might not seem like a person you would want giving you advice; others might be much more appealing because of their ability to explain clearly, their generosity, and how you perceive their knowledge of what they are talking about.

    Every person who has given me advice or done SEO or technical work for my website is a person I have met in an SEO forum. I go to different people for different types of advice, and I often get two opinions on important projects. I am not telling you this because I am looking for work. I never give private consultations or do SEO work for anyone, and I never recommend people to work on the websites of others. These are things that you should decide for yourself. Get educated about SEO. Learn who gives good advice.

    | EGOL
    1

  • If the content is the same, but translated, without the hreflang tag, it is duplicate content. The hreflang tag is there to serve as a signal that you know the content is the same but translated and you want Google to know that as well. This situation is exactly what hreflang is made for.

    | katemorris
    0

  • Hi! Have you noindexed the pages too? That may help to make sure they aren't being crawled, if that's what's concerning you, and may at least give Google another signal not to crawl those pages. Obviously it's not a catch-all, as there's only so much you can do to tell Google not to crawl a page. Sometimes, if the alternative page is linked to internally (which it sounds like it is), Google will crawl it anyway, even though it has a canonical on it, because your internal links are showing that the page is important to your site. It may be worth testing a few pages to see if it has an impact.
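    If crawling itself is the concern, the most direct signal is a robots.txt rule rather than an on-page tag (the path here is a hypothetical placeholder, not from the original answer):

    ```
    User-agent: *
    Disallow: /duplicate-section/
    ```

    Note that noindex and robots.txt interact: a page blocked in robots.txt won't be crawled, so an on-page noindex tag there can never be seen.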

    | Kelly_Edwards
    0

  • Off the top of my head, some things to check:
    - noindex tags deployed from staging during the redesign
    - broken rel=canonical / hreflang
    - updates to robots.txt
    - broken status codes (are those pages still returning 200 OK?)
    - site speed - has the performance of the site as a whole taken a hit? It needn't necessarily be about those pages specifically, but could be about the site's health as a whole
    - changes to XML sitemaps

    Some other harder-to-check / track-down causes:
    - changes to internal link architecture
    - removal of pages that were the target of powerful inbound links

    Hope something there helps - good luck!

    | willcritchlow
    0

  • Best of luck! Definitely interested in hearing back about how it goes.

    | Dr-Pete
    0

  • Hi Lawrence, I do think RickRoll's advice was good, and the first thing I would say is that you should leave those 301 redirects in place, on a page-to-page level, for a good long while - don't change them again. It looks like there are still some pages from boston.renthop.com in the index, which indicates that Google may not fully understand that these pages have all moved. There are only a few still in there, though, so I don't think that's the only issue.

    One thing that can happen when pages 404 for a while is that even if you 301 them later, the link equity of the original pages can disappear (it doesn't always, but it can). So you probably want to spend some time building links to these pages/site sections, and where you can, I also recommend reaching out to sites that linked to the old subdomains and asking them to update the links. I'd recommend trying to get new links from local-specific sites (i.e. sites about Boston or Chicago, vs. nationwide/non-local sites about apartment hunting), and here's why:

    In my experience, Google usually treats "{keyword} in {location name}" searches differently depending on whether or not you're actually in the location you're searching for. If you're not in the location, it treats {location name} as a keyword, but if you are in the location, it treats {location name} as a location. So for example, if you search "apartments in Boston" and you're not in Boston, it will show you pages that rank for the term "apartments in Boston" - but if you are in the Boston area, it will show you pages that rank for the term "apartments" and that have strong local indicators that they are in Boston, or more specifically, are near to you.

    So since you rank nationally but not locally for this term, it sounds like the signals you need to focus on improving are local signals. You can do this by building links from local sites, by adding e.g. "Boston, MA" to title tags on individual apartment listings, and by building out some unique, locally-focused content like neighborhood guides to add value to searchers from that area. I hope that helps!

    | RuthBurrReedy
    0

  • Hi Garrett, I agree with Dave - getting some inbound links and other press will help. Bing and Yahoo tend to rate keywords being present in the domain/URL more strongly than Google does, which may be why you're seeing it come up there. It looks like https://feello.com is now ranking on page 1 for "feello" - an incognito search I just did saw it at position 8, under the image carousel. One thing you might want to do, if you haven't already, is use Organization markup in schema.org to try to improve Google's Knowledge Graph entry for the business. That will help Google understand that Feello is a branded term, and that that brand is associated with your website. You should also use the "sameAs" schema.org property to link to your LinkedIn, Twitter and Facebook pages, since Google is currently ranking those higher than your domain - that will help Google understand the relationships between those pages. Good luck!
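    As a sketch of the Organization markup and sameAs property described above (the social profile URLs are illustrative guesses, not taken from the original answer):

    ```html
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Feello",
      "url": "https://feello.com",
      "sameAs": [
        "https://www.linkedin.com/company/feello",
        "https://twitter.com/feello",
        "https://www.facebook.com/feello"
      ]
    }
    </script>
    ```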

    | RuthBurrReedy
    0