Questions
Bing & JavaScript?
What I can tell you, and have verified on my own site, is that JavaScript is processed differently by Google and Bing. Google crawls JavaScript and also the URLs inside it, while Bing does not. I may fall a bit short of fully answering your question, but that's what I've verified.
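As a hypothetical illustration (the element ID, URL and anchor text are made up, not from the original answer), a link that only exists after JavaScript runs is the kind of thing Google can typically discover, while Bing is far less likely to:

```html
<!-- The link below is only present in the rendered DOM, not in the raw HTML source -->
<div id="related-links"></div>
<script>
  // Inject a link into the page after load
  var container = document.getElementById('related-links');
  var link = document.createElement('a');
  link.href = '/guides/javascript-seo'; // URL exists only after script execution
  link.textContent = 'JavaScript SEO guide';
  container.appendChild(link);
</script>
```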
Technical SEO Issues | | martinxm0 -
Personalized Content Vs. Cloaking
It sounds like you're on the right track. If users and bots start off with the same content, that's a good start. From there, the question is "how much content is being customized, and how frequently?" For example, if you're swapping out 5 different headlines for 40% of users, and 60% of users see the original, that's not a big deal, particularly if the rest of the page is the same. But if you're swapping out 80% of the page copy (e.g. removing a bunch of excess copy that is shown for SEO purposes), and 60-90% of users are seeing that "light" version of the page, you run the risk of two things: first, the chance that the page wouldn't pass a manual review if one was performed; second, the chance that Google may render a copy of the page as a user (not announcing themselves as a crawler), see a different version of the page multiple times, and then effectively devalue the missing content, or worse, flag the page in their system as cloaked content. We could get lost in the details of whether or how they're doing this, but from a technology standpoint it's pretty simple for them to render content from non-official IPs and user agents and do an "honesty check" for situations where content shows up multiple ways. This is already how they compare the page on desktop vs. mobile to see which sections of the page render and which are changed. I think you are also right to rely on site interaction before personalizing, but since there are multiple ways to do that, you should know that it's possible for Google to simulate some of those interactions. So there's a chance that at some point they will render your content in a personalized manner, particularly if personalization is the result of visiting a URL or clicking a simple toggle switch or button.
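A minimal sketch of the safer pattern described above, assuming made-up element IDs and copy: the initial HTML served to users and bots is identical, and content only changes after an explicit user interaction.

```html
<!-- Everyone (users and bots) receives the same initial HTML -->
<div id="intro-copy">
  <p>Full introductory copy that all visitors and crawlers see by default.</p>
</div>
<button id="personalize-toggle">Show the short version</button>
<script>
  // Content only changes after an explicit user action
  document.getElementById('personalize-toggle').addEventListener('click', function () {
    document.getElementById('intro-copy').innerHTML =
      '<p>Condensed copy shown only after the visitor asks for it.</p>';
  });
</script>
```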
Technical SEO Issues | | KaneJamison0 -
What's the best way to test Angular JS heavy page for SEO?
Hi Zack, I think your concern here is valid (your render with Screaming Frog or any other client is unlikely to be precisely representative of what Googlebot will see/index). That said, I'm not sure there's much you can do to eliminate this knowledge gap in your QA process. For instance, while we have seen Googlebot timing out JS rendering around the ~5s mark using the "Fetch & Render as Googlebot" functionality in Search Console (see slide 25 of Max Prin's slide deck here), there's no confirmation this time limit represents Googlebot's behavior in the wild. Additionally, we know that Googlebot crawls with limited JS support - for instance, when a script uses JS to generate a random number, my colleague Tom Anthony found that Googlebot's random() JS function is deterministic (it returns a predictable set) - so it's clear they have modified the headless version of Chrome they use to conserve computational expense in this way. We can only assume they've taken other steps to save computing costs, and none of this is baked into Screaming Frog or any other crawling tool. We have seen that with a 5s timeout set in Screaming Frog, the rendered result is pretty close to what the "Fetch & Render as Googlebot" functionality demonstrates. And with the ubiquity of JS-driven content on the web today, provided links and content are rendered into the DOM fairly quickly (well ahead of that 5s mark), we've seen Google rendering and indexing JS content fairly reliably. The ideal would be for your dev team to code these pages to degrade gracefully - so that even with JS support totally disabled, navigation and content elements are still rendered (they should be delivered in the page source, then enhanced with JS, if possible). Failing that, the best you're likely to achieve here is reasonable confidence that Googlebot can crawl, render and index these pages - there will be some risk when you publish them to production. Hope this helps somewhat - best of luck! Thanks, Mike
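A hypothetical sketch of the graceful-degradation approach described above (the markup and script are illustrative assumptions, not from the original answer): the critical links and copy ship in the HTML source, and JavaScript only enhances them.

```html
<!-- Critical navigation and copy are in the page source, so they survive with JS disabled -->
<nav>
  <a href="/products/">Products</a>
  <a href="/guides/">Guides</a>
</nav>
<div id="product-summary">
  <p>Server-rendered product description that crawlers can index without executing JavaScript.</p>
</div>
<script>
  // Enhancement only: add interactive extras on top of the server-rendered content
  document.getElementById('product-summary').insertAdjacentHTML(
    'beforeend',
    '<button type="button">Compare similar products</button>'
  );
</script>
```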
Technical SEO Issues | | MikeTek0 -
Links On Out Of Stock Product Pages Causing 404
Hi Anthony, Thanks for that response, that makes a lot of sense. Best, Zack
Technical SEO Issues | | znotes0 -
Can you keep your old HTTP XML sitemap when moving to HTTPS site-wide?
Hi Zack! When you migrate your site to HTTPS, all your URLs will change to HTTPS. So there will be no need to keep the old sitemap alive or keep track of the HTTP indexation. Of course, you must keep track of the indexation of the new site - remember to create a new Search Console profile for that. Here is an excellent article and checklist covering everything you should do in an HTTPS migration: The HTTP to HTTPS Migration Checklist in Google Docs to Share, Copy & Download, from Aleyda Solis. Hope this helped you. Best of luck. GR.
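As a hypothetical illustration (the domain and paths are made up), the sitemap you submit under the new HTTPS Search Console property should list only HTTPS URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Post-migration sitemap: every URL uses the https:// version of the site -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <loc>https://www.example.com/category/product-page/</loc>
  </url>
</urlset>
```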
Technical SEO Issues | | GastonRiera0 -
Landing pages showing up as HTTPS when we haven't made the switch
What I would do is the following: change the rel canonical back, remove the HTTPS version from Search Console (you will need to add the HTTPS version of the website as a property in Search Console to be able to do this), and then fetch and request reindexing of the HTTP version (also from Search Console). So basically, help Google understand this mistake and go back to the HTTP version. Also, check your sitemaps and be sure that you are not including HTTPS links there. Hope this helps.
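As a hypothetical example (the URL is made up), the canonical tag on each affected page would point back to the HTTP version until the site actually migrates:

```html
<!-- Served on http://www.example.com/landing-page/ (and on any stray https:// copy of it) -->
<link rel="canonical" href="http://www.example.com/landing-page/" />
```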
Technical SEO Issues | | iugac0 -
Rel=canonical on landing page question
Hi there, A lot of this sounds off to me. First, I'd think you'd want /category living in the navigation, indexed, attracting links, and delivering a great user experience. In my mind, www.example.com/category?view=all should only exist as a filtering URL for when a visitor changes the number of items they want to see on the page itself. You'll have substantially more luck focusing on version A, in my opinion. Focus on creating a great user experience and optimization strategy, and you should reap the benefits at a deeper level. Let me know if this helps! Good luck! Patrick
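One common way to keep such a filtered URL out of contention (this specific tag is not prescribed in the answer above; the example.com URL comes from it, the rest is an assumption) is a canonical on the view-all variant pointing back to the main category page:

```html
<!-- Served on www.example.com/category?view=all, the filtered variant -->
<link rel="canonical" href="https://www.example.com/category" />
```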
Technical SEO Issues | | PatrickDelehanty0 -
Keyword Stuffing
I almost didn't go look at that Walmart page because I thought that they were above keyword stuffing for SEO purposes. However, I did go look and was surprised. In my opinion, Walmart has added a few hundred words of total yada yada yada text that is designed for nothing other than search engines. They probably know that humans are not going to read it, and they floated it so low on the page that nobody is going to see it. So, without knowing who did this and their intent, my guess is that this text was created for nothing other than SEO purposes. They want some paragraph text on the page and have done a fine stuffing job on it. Overstuffed, in my opinion, but done in a way that search engines are probably not going to be concerned about it because it is in natural language. This stuff might have positive ROI.
White Hat / Black Hat SEO | | EGOL1 -
Rich snippets not showing up in Google
It is up to Google, Zack, and there is never a guarantee that they will show, but you would like to think that they would be adopted at some point. What I would also suggest is having a look on the Google forums for help as well. -Andy
Reviews and Ratings | | Andy.Drinkwater0 -
SERP cannibalization
Zack, A change like this on a money-term ranking page is always going to incur some risk. But here's what I think after your update.

/gifts/birthday-gifts ---> Better user engagement metrics, higher Page Authority, closer to the root. This curated landing page is best for a user who doesn't know which gift to buy, or even which type, but knows who they are buying it for - a her or a him. I would say this describes a good (though it could be better) experience for someone searching for a short-tail, generic phrase like "Birthday Gifts" on Google.

/gifts/birthday-gifts/birthday-gifts ---> This landing page is more like a typical category page, with filters and facets to narrow down the search by gender, price and other features.

They aren't the same page right now, so I wouldn't use the rel canonical tag as a way to consolidate them. The View All category page is good for crawling, but the curated landing page is better for users. I would come up with a layout that combines the best of both pages and test that option against a percentage of the traffic to /gifts/birthday-gifts. Basically, create a B version of that page which includes the filtering options available on the category page. These options can affect what shows up in the first carousel for "all birthday gifts" while the curated sections remain on the page for easy self-selection (i.e. For Him / For Her). Assuming that page converts at least as well as the existing A version, I would give serious thought to combining these two pages to see if you can get your highest-performing landing page into the #1 spot. Given the relatively lower authority of the existing #1 spot, I think this is definitely doable. You do risk losing some traffic that would have been more inclined to click on your listing after seeing more than one in the SERPs, but you can offset that with PPC, as you're doing now. Collect your baselines first. Combine the totals from both pages to see if the consolidation results in more sales.
Intermediate & Advanced SEO | | Everett0 -
Drop in Indexed Page + Organic Traffic
Oops, I missed that part. Have you checked Google Search Console to see if someone set any URL parameters? The first thing I would do is determine how many pages actually should be indexed, to see if there's a large discrepancy between that and the number Google shows. A crawler like Screaming Frog can help with this: if you export the crawl to Excel, you can easily remove duplicates in the canonical URL column and filter out the noindexed pages. If you find there's no real discrepancy, Google may have simply been cleaning house of some really old links in the index that hadn't been crawled in a while. Beyond that, if you can pinpoint any specific URLs that have been deindexed, use the "Fetch as Google" tool to help diagnose, or post them here so the community can take a look.
Technical SEO Issues | | LauraSultan0 -
Duplicate Landing Pages showing up in search results
Hi Zack, I'm not sure if it is caused by the sitemap.xml, but the old URL is still in the sitemap.xml. You should always try to avoid having redirected URLs and error pages in the sitemap.xml. You can try to remove the URL from the sitemap and resubmit the sitemap in Google Search Console. André
Technical SEO Issues | | ConclusionDigital0 -
Do you need a canonical tag for search and filter pages?
Definitely agree with Robert. You do not need rel=canonical tags on filtered / search views of your content. After you remove these rel=canonical tags, I'd suggest running your site through Screaming Frog's rel=canonical error report to confirm that the rel=canonical issues are fixed. Hope that helps, B
Technical SEO Issues | | BritneyMuller0 -
Google crawling item page reviews
Thanks for being so thorough with your answer! I'll have our tech team take a look at the markup. We used to have our review content embedded as part of the HTML code, and I've heard that increases crawl frequency and is also easy for search engines to understand. Now I think we might be using AJAX, which apparently causes confusion for crawl bots.
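A minimal sketch of the embedded approach mentioned above, assuming made-up product and rating values (not from the original thread): shipping aggregate review data in the served HTML as schema.org JSON-LD means crawlers don't have to execute an AJAX call to see it.

```html
<!-- Review data is in the initial HTML, so no AJAX execution is needed to read it -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
</script>
```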
Reviews and Ratings | | znotes0 -
Tricky Duplicate Content Issue
Hi, Your 'page 1' URLs should canonicalize to the root level of that page, and the rel=prev tag on page 2 should point to that same root. Beyond that, the pagination markup for rel=prev/next is standard and pretty simple to implement.
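As a hypothetical illustration (the domain and page parameter are made up), page 1 canonicalizes to the root URL and page 2's rel=prev points back to that same root:

```html
<!-- On page 1, e.g. https://www.example.com/category?page=1 -->
<link rel="canonical" href="https://www.example.com/category" />
<link rel="next" href="https://www.example.com/category?page=2" />

<!-- On page 2, e.g. https://www.example.com/category?page=2 -->
<link rel="prev" href="https://www.example.com/category" />
<link rel="next" href="https://www.example.com/category?page=3" />
```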
Technical SEO Issues | | LoganRay0 -
Implementing AMP pages on WordPress blog
Hi Zack, Great question. First of all, I haven't gotten to play with the official AMP plugin for WordPress yet, but I've been researching and reading about the implementation of AMP pages within WordPress for a while now. It's recommended that you utilize AMP pages if you are a publisher or push out content often. In other words, AMP is not ideal (yet) for ecommerce, casual bloggers, etc. You can definitely try it out, but AMP was specifically designed for publishers that churn out a lot of content. To answer your question regarding duplicate content: from my understanding, the proper way of implementing AMP pages and avoiding duplicate content is to add a rel="amphtml" link on your main page that points to the AMP page (this way Google sees that you have an AMP version of the page), as well as a canonical tag on the AMP page that points back to the regular page. In other words, you would have something like the sketch below. Here's Google's take on this. I'm assuming that the plugin will automatically add the correct code to the pages, but I have not tested it, so I wouldn't know for sure. Please do let us know if you end up testing this out! Also, you might want to check out Yoast's "Glue" plugin for editing and adjusting AMP pages. It seems like it could be helpful. Cheers
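A minimal sketch of that pairing, assuming a hypothetical URL structure:

```html
<!-- On the main page, e.g. https://www.example.com/blog/post/ -->
<link rel="amphtml" href="https://www.example.com/blog/post/amp/" />

<!-- On the AMP page, e.g. https://www.example.com/blog/post/amp/ -->
<link rel="canonical" href="https://www.example.com/blog/post/" />
```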
Intermediate & Advanced SEO | | sergeystefoglo0