Hi Moz Community,
Are Bing/Yahoo crawlers different from Google’s crawler in how they process client-side JavaScript, especially content/data loaded by client-side JavaScript?
Thanks,
Hi Moz Community,
I have a question about personalization of content: can we serve personalized content without being penalized for serving different content to robots vs. users? If the content starts in the same initial state for all users, including crawlers, is it safe to assume there should be no impact on SEO, since personalization won't happen for anyone until there is some interaction?
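To make the question concrete, here's a rough sketch of what I mean by "same initial state" (the endpoint, element id, and render helper are just placeholders, not our actual code):

    <script>
      // Everyone, including crawlers, receives the same initial HTML.
      // Personalized content is only fetched after a real user interaction.
      document.getElementById('show-recommendations').addEventListener('click', function () {
        fetch('/api/recommendations')                                   // placeholder endpoint
          .then(function (res) { return res.json(); })
          .then(function (items) { renderRecommendations(items); });    // placeholder helper
      });
    </script>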
Thanks,
Hi Moz Community,
Is there a proper way to do an SPA (client-side rendered) and PWA without having a negative impact on SEO? Our dev team is currently trying to convert most of our pages to a client-side-rendered Angular single-page application. I told them we should use a prerendering service for users that have JS disabled, or use server-side rendering instead, since this would ensure that most web crawlers would be able to render and index all the content on our pages even with all the heavy JS use. Is there an even better way to do this, or some best practices?
In terms of the PWA that they want to add along with the SPA changes, I told them this is pretty much separate from the SPA, since the two aren't dependent on each other; adding a manifest and service worker to our site would just be an enhancement. Also, if we do a complete PWA with JS populating the content/data within the app shell (meaning not just the header and footer, but making the body a dynamic JS template as well), would that affect our SEO in any way? Any best practices here as well?
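Just to illustrate what I mean by an enhancement, the PWA pieces I'm picturing are roughly this (file names here are placeholders):

    <link rel="manifest" href="/manifest.json">
    <script>
      // Register the service worker only in browsers that support it;
      // crawlers and older browsers simply ignore this enhancement.
      if ('serviceWorker' in navigator) {
        navigator.serviceWorker.register('/service-worker.js');
      }
    </script>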
Thanks!
Hi Moz community,
Our tech team has recently decided to try switching our product pages to be JavaScript-dependent; this includes links, product descriptions, and things like breadcrumbs rendered in JS. Given my concerns, they will create a proof of concept with a few product pages in a QA environment so I can test the SEO implications of these changes. They are planning to use Angular 5 client-side rendering without any prerendering. I suggested Angular Universal, but they said the lift was too great, so we're testing to see if this works.
I've read a lot of the articles in this guide to all things SEO and JS and am fairly confident in understanding when a site uses JS and how to troubleshoot to make sure everything is getting crawled and indexed.
https://sitebulb.com/resources/guides/javascript-seo-resources/
However, I am not sure I'll be able to test the QA pages since they aren't indexable and live behind a login. I will be able to crawl them using Screaming Frog, but that's generally regarded as a proxy for what a crawler should be able to crawl, not necessarily what Googlebot will actually be able to render and index.
Any thoughts on this? Is the concern valid?
Thanks!
Thanks for your help on this Nigel
Hey Nigel,
These parameters are already in my Search Console, but Moz is still picking them up as duplicates.
Hi Logan,
I've seen your responses on several threads now on pagination and they are spot on, so I wanted to ask you my question. We're an eCommerce site and we're using the rel=next and rel=prev tags to avoid duplicate content issues. We've gotten rid of a lot of duplicate issues in the past this way, but we recently changed our site. We now have the option to view 60 or 180 items at a time on a landing page, which is causing more duplicate content issues.
For example, page 2 of the 180-item view is similar to page 4 of the 60-item view (URL examples below). Each view has its own rel=next and rel=prev tags. Wondering what we can do to get rid of this issue besides just removing the 180- and 60-item view options.
https://www.example.com/gifts/for-the-couple?view=all&n=180&p=2
https://www.example.com/gifts/for-the-couple?view=all&n=60&p=4
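For reference, each view currently carries its own pagination tags, so the head of page 4 of the 60-item view looks roughly like this (simplified):

    <link rel="prev" href="https://www.example.com/gifts/for-the-couple?view=all&n=60&p=3">
    <link rel="next" href="https://www.example.com/gifts/for-the-couple?view=all&n=60&p=5">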
Thoughts, ideas or suggestions are welcome. Thanks!
Hi Nigel,
Thanks for the response and the post; I've actually read the article before and used rel=next and rel=prev to fix some duplicate content issues caused by pagination in the past.
Right now, rel=next and rel=prev is not solving my duplication problems, because pagination isn't really the issue, so to speak. The duplication is occurring because I have two page types (one viewing 60 items and one viewing 180 items, kind of like a filter). Each view (60 & 180) has its own set of pagination rules, but it looks like page 4 of the 60 view is a duplicate of page 2 of the 180 view, if that makes sense.
It becomes really tricky here to try and find a solution.
Hi Moz Community,
We're an eCommerce site, so we have a lot of pagination issues, but we were able to fix them using the rel=next and rel=prev tags. However, our pages have an option to view 60 items or 180 items at a time. This is now causing duplicate content problems when, for example, page 2 of the 180-item view is the same as page 4 of the 60-item view (URL examples below). Wondering if we should just add a canonical tag pointing to the main view-all page on every page in the paginated series to get rid of this issue.
https://www.example.com/gifts/for-the-couple?view=all&n=180&p=2
https://www.example.com/gifts/for-the-couple?view=all&n=60&p=4
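To be clear, the idea would be that every paginated URL in both views carries the same canonical pointing to the view-all page, so both of the URLs above would get something like this (assuming the parameter-free ?view=all URL is the view-all page):

    <link rel="canonical" href="https://www.example.com/gifts/for-the-couple?view=all">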
Thoughts, ideas or suggestions are welcome. Thanks
Hi Anthony,
Thanks for that response, that makes a lot of sense.
Best,
Zack
Hi Anthony,
Thanks for your reply. We have a very high turnover rate for products, and many items go out of stock frequently. Would you recommend just noindexing the out-of-stock pages that don't have any traffic or links and are very old, so they don't waste crawl budget? Normally it takes a while for Google to drop 404 or 410 pages from its index, especially if the pages are old and don't get crawled very often.
Thanks
Zack
Hi Moz Community!
We're doing an audit of our e-commerce site at the moment and have noticed a lot of 404 errors coming from out-of-stock/discontinued product pages that we've kept returning a 200 in the past. We kept these pages and added links on them to categories or products similar to the discontinued items, but many other links on the page (images, blog posts, even breadcrumbs) have broken or are no longer valid, causing lots of additional 404s.
If a product has been discontinued for a long time, gets no traffic, and has no link equity, would you recommend adding a noindex robots tag on these pages so we're not wasting time fixing all the broken links on them?
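In other words, something like this in the head of those discontinued product pages, keeping the page a 200 but out of the index:

    <meta name="robots" content="noindex, follow">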
Any thoughts? Thanks
Hi Mozers,
I want to keep the HTTP XML sitemap live on my http site to keep track of indexation during the HTTPS migration. I'm not sure if this is doable, since once our tech team forces the redirects, every http page will become https.
Any ideas? Thanks
Hi Christian,
Thanks for the reply. HTTPS rel=canonical tags were added to the live pages; as I expected, this is why some are showing up in the search results. It's a problem, though, for GA and Search Console tracking, since we haven't made the switch server-side and the http pages don't redirect to their https versions yet. So we're seeing no sessions for our http versions.
If I change the rel=canonical back to http on the live site, I'm guessing the non-secure pages will show up again after being crawled?
Thanks!
Hi Moz Community,
Recently our tech team has been taking steps to switch our site from http to https. The tech team has looked at all the SEO redirect requirements and we're confident about the switch; we're not planning to roll anything out until a month from now.
However, I recently noticed a few https versions of our landing pages showing up in search. We haven't pushed any changes out to production yet, so this shouldn't be happening. Not all of the landing pages are https, only a select few, and I can't see a pattern. This is messing up our GA and Search Console tracking, since we haven't fully set up https tracking yet because we were not expecting these pages to change.
HTTPS has always been supported on our site but never indexed, so it's never shown up in the search results. I looked at our current site, and it looks like the landing page canonicals are already pointing to their https versions; this may be the problem.
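For example, the head of one of the affected http landing pages contains something along these lines (URL is illustrative), which I assume is telling Google to index the https version:

    <link rel="canonical" href="https://www.example.com/landing-page/">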
Anyone have any other ideas?
Does anyone have insight into the session percentage lift for their blog or their site after making the move from a subdomain blog to a subfolder? I'm seeing a lot of people talk about improvements in rankings for keywords on their blog and site but haven't seen anyone list out session numbers to go with that data.
Thanks
Currently we have two versions of a category page on our site (listed below)
Version A: www.example.com/category
• lives only in the SERPs but does not appear in our site navigation
• has links
• user experience is not the best
Version B: www.example.com/category?view=all
• lives in our site navigation
• has a rel=canonical to version A
• very few links and doesn’t appear in the SERPs
• user experience is better than version A
Because the user experience of version B is better than version A, I want to take out the rel=canonical from version B to version A and instead put a rel=canonical on version A pointing to version B. If I do this, will version B eventually show up in the SERPs and replace version A? If so, how long do you think this would take? Will this essentially pass PageRank from version A to version B?
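In other words, the swap I'm proposing would look roughly like this (simplified):

    <!-- Version A: www.example.com/category -->
    <link rel="canonical" href="https://www.example.com/category?view=all">

    <!-- Version B: www.example.com/category?view=all (self-referencing) -->
    <link rel="canonical" href="https://www.example.com/category?view=all">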
Hi all,
I've been looking around at some of our competitors' websites and I've been noticing huge amounts of keyword stuffing throughout the pages, and also grouped at the bottom of the page. From what I've been taught, it's not a good thing to do and you can be penalized for it. What's everyone else's take on keyword stuffing and how it's looked upon in 2017? Is there a maximum number of keywords you should have on your page?
Here are a few URLs to the websites I'm talking about and the pages in question.
https://www.walmart.com/cp/personalized-gifts/133224 - Keyword stuffing in the bottom group text for the word "personalized"
http://www.personalcreations.com/unique-groomsmen-gifts-pgrmsmn - Keyword stuffing in bottom group text for "groomsmen"
http://www.groovygroomsmengifts.com/ - keyword stuffing throughout page for "groomsmen"
Hi Andy,
Thanks for the quick reply. We did not get any errors or warnings in our Search Console when this was implemented. We added the star ratings markup to our product pages back in late Oct or early Nov of 2016. We also have price and availability markup on our product pages.
We did take a look at one of our product pages using the testing tool and everything seems to be fine. I've heard from a few others that it's really up to Google whether or not they choose to show the rich snippet features in search, but I wanted to know if anyone had any advice.
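For context, the markup on our product pages is along these lines (shown here as JSON-LD with placeholder values, not our real data):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Product",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": 4.6,
        "reviewCount": 128
      },
      "offers": {
        "@type": "Offer",
        "price": "29.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock"
      }
    }
    </script>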