Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Content change and variations in ranking
Well, is there far more competition for the keyword(s) you are optimizing this old page for than for the keyword(s) the new page is optimized for? We would say it's common not to see ranking changes after making changes to content. For every query, Google has several hundred thousand pages (at least) it could return in the SERPs. Sometimes content tweaks alone aren't enough to move the needle.
| Nozzle1 -
Duplicate content on ecommerce sites
With the caveat that this is a 7-yo thread -- I'd say that it's generally more of a filter these days (vs. a Capital-P penalty). The OEM or large resellers are almost always going to win these battles, and you'll be at a disadvantage if you duplicate their product descriptions word-for-word. Can you still rank? Sure, but you're going to have an easier time if you can add some original value. If you aren't allowed to modify the info, is there anything you can add to it -- custom reviews (not from users, but say an editorial-style review), for example? You don't have to do it for thousands of products. You could start with ten or 25 top sellers and see how things go.
| Dr-Pete0 -
Too many SEO changes needed on a page. Create a new page?
Firstly, see what Google thinks of the page. On-page SEO checkers require wider experience to use accurately and efficiently; you can't just take insights from online tools and run with them. If the page has no associated keywords / search queries (in Ahrefs, GSC, Google Analytics, SEMrush, etc.) then that would suggest Google isn't really that interested. Because keyword data (even from Google) is always heavily sampled, you'd also want to check whether, as a 'landing page' (in Google Analytics), the page has been receiving any traffic from Google (Organic Search segment). If the page is doing well despite what online tools say, rules be damned. If the page isn't performing well (or at all), then just re-build it from scratch in line with best practices and 301 redirect the old URL to the new one. If the URL stays the same then just rebuild on the active URL; that's fine too (but don't publish until the complete re-build is 100% finished and you are 110% happy with it).
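If you do rebuild on a new URL, it's worth confirming that the old URL really returns a single 301 hop to the new one (not a 302 or a chain). A minimal sketch in Python using the requests library -- the URLs are placeholders:

```python
import requests

# Hypothetical URLs -- swap in your own old and rebuilt pages
OLD_URL = "https://www.example.com/old-page"
NEW_URL = "https://www.example.com/new-page"

# Fetch the old URL without following redirects, so we see the first
# response a crawler would see
resp = requests.get(OLD_URL, allow_redirects=False, timeout=10)
status = resp.status_code
location = resp.headers.get("Location", "")

if status == 301 and location.rstrip("/") == NEW_URL.rstrip("/"):
    print("OK: old URL permanently redirects to the rebuilt page")
else:
    print(f"Check this: status {status}, Location header {location!r}")
```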
| effectdigital1 -
Removing the Trailing Slash in Magento
You could always force trailing slashes instead of removing all trailing slashes. What you really want to establish is which structure has been linked to more often (internally and externally). A 301 redirect, even a deeper, more complex rule, is seldom the answer in isolation. What are you going to do (for example) when you implement this, then realise most of the internal links use the opposite structure to the one you picked, and all your internal links get pushed through 301s and your page-speed scores go down?

What you have to do is crawl the site now, in advance, and work out the internal structure. Spend a lot of time on it, days if you have to, and get to grips with the nuts and bolts of it. Figure out which structure most internal/external links utilise and then support it.

Likely you will need a more complex rule than 'force all' or 'strip all' trailing slashes. It may be the case that most pages contain child URLs or sub-pages, so you decide to force the trailing slash (as traditionally that denotes further layers underneath). But then you'll realise you have embedded images in some pages with URLs ending in ".jpg" or ".png". Those are files (hence the file extension at the end of the URL), so with those you'd usually want to strip the slash instead of forcing it. At that point you'd have to write something that says: force the trailing slash unless the URL ends with a file extension, in which case always remove the slash (or similar). A sketch of that kind of rule is below.

Picking the right structural format for any site usually takes a while and involves quite a bit of research. It's a variable answer, depending upon the build of the site in question and how it has been linked to externally, from across the web. I certainly think that too many people use the canonical tag as a 'cop out' for not creating a unified, strong, powerful on-site architecture. I would say do stick with the 301s and consolidate your site architecture, but do some crawling and backlink audits -- really do it properly, instead of just taking someone's 'one-liner' answer online. Here at Moz Q&A, there are a lot of people who really know their stuff! But there's no substitute for your own research and data.

If you're aiming for a specific architecture and have been told it could break the site, ask why. Try and get exceptions worked into your recommendations which flip the opposite way -- i.e.: "always strip the trailing slash, except in X situation where it would break the site. In X situation always force the trailing slash instead."

Your ultimate aim is to make each page accessible from just one URL (except where parameters come into play; that's another kettle of fish to be handled separately). You don't have to have EVERYTHING on the site one way or the other in 'absolute' terms. If some URLs have to force the trailing slash whilst others remove it, fine. The point is to get them all locked down to one accessible format, but you can have varied, controlled architectures inside of one website.
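A minimal sketch of that rule in Python -- the extension list is hypothetical, and the real rule should come out of your own crawl data:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical extension list -- adjust to whatever your crawl actually shows
FILE_EXTENSIONS = (".jpg", ".jpeg", ".png", ".gif", ".pdf", ".css", ".js", ".xml")

def canonical_form(url: str) -> str:
    """Force a trailing slash on page URLs; strip it from file URLs."""
    parts = urlsplit(url)
    stripped = (parts.path or "/").rstrip("/")
    if stripped.lower().endswith(FILE_EXTENSIONS):
        path = stripped            # files never carry a trailing slash
    else:
        path = stripped + "/"      # pages always carry a trailing slash
    return urlunsplit((parts.scheme, parts.netloc, path, parts.query, parts.fragment))

print(canonical_form("https://example.com/category/shoes"))     # .../shoes/
print(canonical_form("https://example.com/media/banner.png/"))  # .../banner.png
```

In practice the same logic ends up expressed as rewrite rules on the server, but working it out in a script first lets you test the rule against a full crawl export before touching the live site.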
| effectdigital0 -
Will I lose SEO value if I rename my URLs to be more keyword-friendly?
I will check those guides and see how I can work on optimizations. Thank you.
| Spiros.im0 -
If the site doesn't have a true folder structure, does having subdirectories really help with hierarchy and passage of equity?
Yes. Link equity or PageRank is passed from page to page via the crawlable links on your site.
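As a toy illustration (not Google's actual algorithm), here is a tiny PageRank-style iteration in Python over a made-up four-page link graph -- equity accumulates according to which pages link where, not according to whether the URLs sit in real folders:

```python
# Hypothetical four-page site: home links to two hubs, each hub links to a leaf
links = {
    "/": ["/hub-a/", "/hub-b/"],
    "/hub-a/": ["/leaf/"],
    "/hub-b/": ["/leaf/"],
    "/leaf/": ["/"],
}

damping = 0.85
rank = {page: 1.0 / len(links) for page in links}

for _ in range(50):  # simple power iteration
    rank = {
        page: (1 - damping) / len(links) + damping * sum(
            rank[src] / len(outs) for src, outs in links.items() if page in outs
        )
        for page in links
    }

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page:10s} {score:.3f}")
```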
| Nozzle1 -
Link Structure June 2019
We agree with Gaston. Our experience is that it is almost never worth the time to change URL structures on an established site. Instead, focus time and effort on creating more and better content, adding more internal links, and doing legit link building. For example, we looked at this blog post, https://www.fishingtackleshop.com.au/blog/chasebaits-lure-range/, where you highlight four separate lures but only link to one of the product pages instead of all four. Furthermore, you linked only once, using a "Shop Now" button. We'd recommend adding another link to that product page, linking the text of the product's name somewhere above the "Shop Now" button within the content discussing that specific lure.
| Nozzle1 -
Is it ok to repeat a (focus) keyword used on a previous page, on a new page?
We like to think of all pages written around a specific topic as a content silo. Many of these pages will include the same keywords, for sure. The key is to choose which page is the "head" of the silo and should rank for the main phrases assigned to that silo. Then you can use all the other pages in the silo to internally link back to the main page with the proper anchor text, thereby helping the main (and correct) page rank for the keyword. To sum up, you might end up with many pages that all include a specific keyword, but you're going to internally link all of them to the main page using the keyword as the anchor text, which basically tells Google that all your pages agree the main page is the most relevant for that keyword.
| Nozzle1 -
How to solve JavaScript paginated content for SEO
Thanks for the thorough response. I was leaning toward leaving it alone for the time being and this helps affirm my decision. I don't think we are going to see much benefit from tampering with it to make it more Googlebot-friendly.
| MJTrevens1 -
UK company not ranking .com domain in UK
Darn. If you'd be willing to share the domain, I'd be happy to do a full crawl of it and return the data to you. If there are no glaring technical errors then at least we'd know it was probably a combo of content issues and off-site popularity metrics, which could be corrected over time. Up to you though. It seems complex, like we should begin ruling stuff out.
| effectdigital1 -
Trying to get Google to stop indexing an old site!
No worries, let us know if it changes anything.
| Martijn_Scheijbeler0 -
What is the impact of having two 302 redirects?
This is right. The 302 keeps the SEO authority on the original URL, though if you take the piss with it (leave 302s up for months on end) then they will go bad and degrade (meaning that even when you remove the 302, it won't restore SEO authority to the old pages). Since the first 302 retains the SEO authority on the first URL, there's no SEO authority for the second 302 to mess up or cause problems with. It's bad UX but little else in reality. If OP is planning to redirect permanently and leave the 302 up forever, then obviously that will eventually kill the SEO authority of the old page (dead), as the authority will linger on the old URL for a while before 'dying'. OP may wish to consider whether the old page will be coming back or not and how long that might take. If it's going to be gone for many months (closing in on a year), multiple years, or is never coming back, alter the 302s to a single 301. If the page will be back in a few days, it's fine as it is; just don't forget to remove the 302s later.
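If you want to see exactly what the chain looks like hop by hop (and confirm it really is two 302s), a minimal sketch in Python with the requests library -- the starting URL is a placeholder:

```python
import requests
from urllib.parse import urljoin

# Hypothetical URL sitting behind a chain of redirects
url = "https://www.example.com/promo"

for _ in range(10):  # cap the hops we follow in case of a loop
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(f"{resp.status_code}  {url}")
    if resp.status_code not in (301, 302, 303, 307, 308):
        break  # reached the final, non-redirecting page
    url = urljoin(url, resp.headers["Location"])  # Location may be relative
else:
    print("Stopped after 10 hops -- probable redirect loop")
```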
| effectdigital1 -
What to do with dynamically translated content sharing same urls?
Yes, this is the main architectural layout Google recommends against. Google's official guidance is here: https://support.google.com/webmasters/answer/182192?hl=en ... scroll down until you find a table. The closest architecture to the one you've described is the parameter-based one, which Google explicitly does not recommend. If Google wouldn't recommend that, then without even the shallow parameter-based signifiers there's little hope that anything would go well. I have seen a lot of sites that try to serve different language content from the same URLs, and they very rarely do well or perform at all in modern times. Read Google's advice and pick from one of their pre-defined, recommended options. Whilst translation plugins can be cheap and useful, they're usually awful for SEO.
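If you do move to one of Google's recommended setups (say, language subdirectories), each language version gets its own URL and the versions point at each other with hreflang annotations. A minimal sketch in Python that builds those tags -- the domain, languages, and choice of x-default are hypothetical:

```python
# Hypothetical language list and domain for a subdirectory setup (/en/, /fr/, /de/)
LANGUAGES = ["en", "fr", "de"]
BASE = "https://www.example.com"

def hreflang_tags(path: str) -> str:
    """Build the <link rel="alternate" hreflang> cluster for one page."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{BASE}/{lang}{path}" />'
        for lang in LANGUAGES
    ]
    # x-default tells Google which version to show when no language matches
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{BASE}/en{path}" />')
    return "\n".join(tags)

print(hreflang_tags("/pricing/"))
```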
| effectdigital0 -
Filtering Views in GA
If you want to see it in the screenshot you gave me, I would imagine you would need a specific Google Analytics view for each TLD. If that's not possible, you can also check out this article which I found (it's not by me, but when I searched your query this is what came up): https://blog.achille.name/web-analyitics-en/google-analytics-filter-extracting-tlds-keywords-adwords/#.XTreYRJKiL8 Aside from that, I'm not sure what else could work if you want to see how many views per TLD. If you want a full dashboard broken down by TLD, the method above might work, although it is dated.
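For context, the filter in that article works by pulling the TLD out of the hostname so sessions can be grouped by it. A rough Python sketch of the same idea -- the hostnames are made up, and a simple regex like this won't handle multi-part suffixes such as .co.uk (a public-suffix-aware library like tldextract would, in practice):

```python
import re

# Made-up hostnames standing in for the hostname dimension in GA
hostnames = ["www.example.com", "www.example.de", "shop.example.co.uk"]

for host in hostnames:
    match = re.search(r"\.([a-z]{2,})$", host)   # grab the last dot-separated label
    tld = match.group(1) if match else "(unknown)"
    print(f"{host:25s} -> .{tld}")
```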
| CJolicoeur0 -
Captcha wall to access content and cloaking sanction
In general, Google cares only about cloaking in the sense of treating their crawler differently to human visitors -- it's not a problem to treat them differently to other crawlers. So: if you are tracking the "2 pages visited" using cookies (which I assume you must be -- there is no other reliable way to know the 2nd request is from the same user without cookies), then you can treat Googlebot exactly the same as human users: every request is stateless (without cookies), so Googlebot will be able to crawl. You can then treat non-Googlebot scrapers more strictly, and rate limit / throttle / deny them as you wish. I think that if real human users get at least one "free" visit, then you are probably OK -- but you may want to consider not showing the recaptcha to real human users coming from Google (though you could find yourself in an arms race with scrapers pretending to be human visitors from Google). In general, I would expect that if it's a recaptcha ("prove you are human") step rather than a paywall / registration wall, you will likely be OK in the situation where:
- Googlebot is never shown the recaptcha
- Other scrapers are aggressively blocked
- Human visitors get at least one page without a recaptcha wall
- Human visitors can visit more pages after completing a recaptcha (but without paying / registering)
Hope that all helps. Good luck!
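A minimal sketch of that flow, assuming a Flask app -- the route, the two-page threshold, and the simple user-agent check are placeholders, and in production Googlebot should be verified via reverse DNS rather than trusted from the header alone:

```python
from flask import Flask, request, make_response, render_template_string

app = Flask(__name__)
FREE_PAGES = 2  # hypothetical "free" page count before the recaptcha wall

@app.route("/articles/<slug>")
def article(slug):
    ua = request.headers.get("User-Agent", "")
    if "Googlebot" in ua:
        # Crawler requests are stateless anyway: never show the recaptcha
        return render_template_string("<h1>{{ s }}</h1><p>Full content...</p>", s=slug)

    pages_seen = int(request.cookies.get("pages_seen", 0))
    if pages_seen >= FREE_PAGES:
        # Human visitor over the limit: serve the recaptcha wall instead
        return render_template_string("<p>Please complete the recaptcha to continue.</p>")

    # Human visitor within the free allowance: serve content and bump the counter
    resp = make_response(render_template_string("<h1>{{ s }}</h1><p>Full content...</p>", s=slug))
    resp.set_cookie("pages_seen", str(pages_seen + 1))
    return resp
```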
| willcritchlow1 -
How to get backlinks with higher rank?
In addition to Jose's response: good old blogger outreach. Come up with a game plan focused on what subject matter, keywords & DAs you're after, then either approach the bloggers yourself or pay a 3rd party to procure the links for you. The latter is the most common approach, as link building can be very time-consuming. As with most things SEO, Moz has an article on this matter: https://moz.com/blog/blogger-outreach-for-your-clients. Also, if you have a Moz Pro account with campaigns set up & running, then there is the Opportunities feature accessible via the campaign dashboard.
| jasongmcmahon1 -
Disallowed "Search" results with robots.txt and Sessions dropped
If you have a site with at least 30k URLs, looking at only 300 keywords won't reflect the general status of the whole site. If you are looking for a 10% loss in traffic, I'd start by chasing the pages that lost the most traffic, then analyzing whether they lost rankings or whether there are some other issues. Another way to find where the traffic loss is happening is in Search Console, looking at keywords that aren't in the top 300. There might be a lot to analyze. It's not a big deal having a lot of pages blocked in robots.txt when what's blocked is correctly blocked. Keep in mind that GSC will flag those pages with warnings, as they were previously indexed and are now blocked; that's just how they've set up flags. Hope it helps. Best of luck. Gaston
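To surface those pages quickly, a minimal sketch with pandas, assuming two hypothetical CSV exports of organic landing-page sessions (one per period, each with "page" and "sessions" columns):

```python
import pandas as pd

# Hypothetical file names -- e.g. GA organic landing-page exports for the two periods
before = pd.read_csv("organic_landing_pages_before.csv")
after = pd.read_csv("organic_landing_pages_after.csv")

merged = before.merge(after, on="page", suffixes=("_before", "_after"), how="outer").fillna(0)
merged["delta"] = merged["sessions_after"] - merged["sessions_before"]

# Pages with the largest absolute traffic loss -- start the analysis here
print(merged.sort_values("delta").head(20)[["page", "sessions_before", "sessions_after", "delta"]])
```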
| GastonRiera0