Questions
Canonical URL Change
Hi Pol, No, not just by changing both the URL and the canonical to something new and different. You'll need a redirect. Is there a reason you can't set one up?
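If you can, a single 301 rule is usually enough. A minimal sketch, assuming an Apache server; the paths are hypothetical placeholders:

```
# .htaccess: permanently redirect the old URL to its replacement.
# /old-page and the target URL are hypothetical placeholders.
Redirect 301 /old-page https://www.example.com/new-page
```

The 301 tells search engines the move is permanent, so they transfer the old URL's signals to the new one.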
Technical SEO Issues | DonnaDuncan
Faceted Navigation URLs Best Practices
Hi there, If you want to provide these different sorting/filtering options, faceted navigation is going to be your best option. As I mentioned above, with this approach you need to use nofollow, noindex, robots.txt, and canonical tags to tell search engines where duplicate content will occur. In the example you provided (masonry vs. list view), you would want a single URL for both views; otherwise you are creating new opportunities for duplicate content. Happy to discuss further if you have any additional questions!
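To make that concrete, here is a minimal sketch of the tags a filtered URL could carry; the URLs and parameter names are hypothetical:

```html
<!-- Served on a filtered view such as /shoes?color=red&sort=price (hypothetical). -->
<!-- Keep the filtered page out of the index, but let crawlers follow its links. -->
<meta name="robots" content="noindex, follow">
<!-- Point search engines at the unfiltered category page as the canonical version. -->
<link rel="canonical" href="https://www.example.com/shoes">
```

robots.txt can then block crawling of parameter combinations you never want fetched at all; just note that a page blocked in robots.txt can't pass along its canonical hint, so that's best reserved for truly worthless facet combinations.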
Intermediate & Advanced SEO | Joe_Stoffel
URL Structure on Category Pages
Thank you, RedSweater.

#1 In general, in our case the same filters would apply across all categories, so I assume we would use them all.

#2 On the issue of the filtered view having the same content as the category page: wouldn't this be solved with canonical URLs? E.g., we currently have this URL: https://www.viatrading.com/wholesale/2/Electronics.html?hideSearchBox=false&keywords=&cid=2&facetNameValue=Category_value_+Electronics with the same content as: https://www.viatrading.com/wholesale/2/Electronics.html Both have a canonical pointing to the 2nd. Just to make sure, is that SEO-friendly?

#3 Isn't including keywords in the URL a best practice, though? See point 3 here: https://moz.com/blog/15-seo-best-practices-for-structuring-urls Also considering the URLs wouldn't be too long, e.g. less than 100 characters; see point 6 here: https://moz.com/blog/15-seo-best-practices-for-structuring-urls

#4 I think that could work. Just to make sure, what is PDP?

Cheers,
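P.S. For clarity, the canonical tag both versions carry in their head looks like this (using the URLs from #2 above):

```html
<!-- Present on both the filtered URL and the clean category URL, telling -->
<!-- search engines that the clean URL is the one to index. -->
<link rel="canonical" href="https://www.viatrading.com/wholesale/2/Electronics.html">
```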
On-Page / Site Optimization | viatrading1
Splitting a strong page - SEO
As a rule of thumb, the fewer 3xx redirects you use the better, not just to limit loss of link equity but also to keep your website easier to manage. You definitely do not want chains of two or three 301s unless they are truly unavoidable given how complicated your website is. Now, your best bet depends on what you want to accomplish. In the past I always tried to be conservative and avoid losing any of my hard-earned traffic, but after a while you see the consequences of that: you end up with a mixed bag of legacy URLs scattered across your website. I would say, test on a relatively small section and see what happens. If the loss of traffic/rankings is too significant, roll the changes back (don't forget to 301 back) and use your preferred method, but keep in mind that in the long run you want a manageable website with as few exceptions as possible. On a side note, people often treat a 301 as a loss of value no matter what, but that's not always the case. The real issue with 301s is losing the value passed from other pages. So, after you 301: change all your internal links so you don't have unnecessary internal 301 hops, and contact external websites to get the URL updated (see the sketch below). Once you do that, the 301 barely matters, because the resources sending value to that page are now linking to the new one. Hope that helped.
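P.S. To illustrate avoiding chained 301s, a minimal .htaccess sketch with hypothetical paths; the goal is that no legacy URL ever has to hop through another legacy URL:

```
# Avoid a chain (A -> B -> C), which costs two hops:
#   Redirect 301 /page-a https://www.example.com/page-b
#   Redirect 301 /page-b https://www.example.com/page-c

# Instead, point every legacy URL straight at the final destination:
Redirect 301 /page-a https://www.example.com/page-c
Redirect 301 /page-b https://www.example.com/page-c
```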
Intermediate & Advanced SEO | mememax
Merging Pages and SEO
It's hard to say how much traffic you'll lose from the merge. Like Logan said, you'll definitely lose a bit when you first move, but long term, you'll need to look at your competition to figure out whether it's better to keep the pages separate or combine them. I don't recommend keeping pages A, B, and C if you're going to hide them from the main structure of your site. Pages get most of their Page Authority from internal links (unless they're link bait), so they won't be able to rank anyway. That said, here's how I'd estimate the loss of traffic from the move:

1. Use Google Search Console to determine the primary keyword(s) for pages A, B, and C.
2. Use a tool like Open Site Explorer to determine the number of links A, B, and C have. (Bonus: look at the websites linking to A, B, or C. If those are resource pages, there's a good chance their webmasters will update their links to page D, which will help with the traffic dip. If they're news articles, you'll probably have to rely on 301s.)
3. Search for each of those top keywords and look at your competition. Does the competition closely target the term? Will page D seem as relevant to the keyword as A, B, or C did?
4. Now look at the Page Authority of the competition for each keyword. Will page D, which will have a combination of the links from A, B, and C, blow your competition out of the water? Roughly match it? Still fall a bit behind?
5. Here's the part that's really tough: for each keyword, estimate where page D would rank, given how well it targets the keyword and how many inbound links it has.
6. Estimate the percentage increase or drop in traffic based on the adjusted click-through rate. You can find this by digging through Google Search Console for a period when your site ranked in a different position, or by using published average click-through rates by position.

Once you're done, combine your estimated percentage increases or drops in traffic to estimate how the new page will perform. (I recommend looking at percent change because adding up totals only for your top keywords won't account for long-tail keywords, and you'll almost certainly come up with a much lower count than you're currently getting.) Not the easiest process in the world, and your estimate will almost certainly be off, since you make a lot of assumptions along the way. But it should give you an idea of whether you'll eventually gain or lose traffic from the move, once the initial Googlebot confusion wears off. Hope this makes sense! Let me know if you have any questions! Kristina
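P.S. A purely illustrative run of step 6, with made-up numbers: suppose page A ranks #4 for its main keyword at roughly a 6% click-through rate, and you estimate the merged page D would rank #2 at roughly 15%. For that keyword you'd pencil in about a 2.5x increase in clicks; a keyword where D is expected to slip from #2 to #5 gets a corresponding decrease. Average those percentage changes, weighted by each keyword's current traffic, to get the overall estimate.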
Intermediate & Advanced SEO | KristinaKledzik
Getting SEO Juice back after Redirect
Thank you for all your answers. EGOL, your link is great and recent. I am removing the redirections, and inactive product pages are starting to be indexed; I marked your answer as the "Good Answer." Moosa, your idea is great; I will propose it to my team. Thomas, thank you for the links. Yes, the inactive products post is mine too. The other was mainly about activating many pages at once, though; I also replied to you there. Cheers,
Intermediate & Advanced SEO | viatrading1
Many New Urls at once
Thanks EGOL, I guess our best bet for showing unavailable products would be to focus on related products. Cheers,
Intermediate & Advanced SEO | viatrading1
Inactive Products - Inactive URLs
Thank you, Thomas. We activated them, and will try to improve our "Related Products" and "Availability Notification" sections for inactive products. Cheers,
Intermediate & Advanced SEO | viatrading1
Find SEO errors
Thank you, that was useful. I can check noindex/nofollow pages like this with Moz, and images without ALT tags using Screaming Frog. However, the Crawl Test Tool report doesn't tell me which images lack ALT tags, nor which pages contain images without ALT tags. Is it possible to check this with Moz? Regards,
Other Research Tools | viatrading1
Several 301 Redirects to Same Page
@Umar: Thank you. URL D is related to A/B/C in the sense that D contains all of our products, and A, B, and C are (were) product categories, so URL D effectively contains all three of them. @Moosa: Yes, that's the case. Visitors to URLs A, B, and C will find what they are looking for on URL D (they will just have to look a bit harder for it), and the A, B, and C links are clean. I do have some external links pointing to A, B, and C; I will update them to point to D. Thank you both, that helps. I will 301 redirect these pages. Also, will the link equity from the three pages add up on D? Cheers,
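P.S. For reference, a sketch of what the redirects could look like in Apache's .htaccess; the category paths are hypothetical stand-ins for the real URLs A, B, and C:

```
# All three retired category URLs point directly at the full catalog page (URL D).
Redirect 301 /category-a https://www.example.com/all-products
Redirect 301 /category-b https://www.example.com/all-products
Redirect 301 /category-c https://www.example.com/all-products
```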
Intermediate & Advanced SEO | viatrading1
Optimize CSS Delivery
The concept Google is getting at here is that your CSS and JS contain elements that are critical for the page to render. The problem is that as the browser downloads them, they can block other resources from being downloaded, because the browser wants to read these files to see everything it needs to render the page. Part of fixing render blocking is reducing the number of files a browser has to download, especially those in the critical path (HTML, CSS, JS) that can block the downloading of other files (images, etc.).

Google is getting even more specific in your case: it is looking at the "above the fold" parts of your page. What Google wants you to do is take any CSS or JS used to render what is above the fold on that page and inline that code into your HTML file. That way, when the browser downloads the HTML file, it has everything it needs to render the visible, above-the-fold part of the page, instead of waiting for the CSS and/or JS files to download.

The problem is that "above the fold" is relative, given the many browser sizes, operating systems, and devices your web server sees on a regular basis. If you have a really good front-end developer, they can take the time to figure out which viewport size is most common and then inline all the CSS and JS for it (note this may differ from page to page) into your HTML, assuming your CSS and JS do not bloat your HTML file size too much. One approach is to take your most common large viewport size and inline everything above the fold at that size, so you have everything covered as the viewport gets smaller. The issue there (as with most responsive sites) is a lot of code bloat for phone browsers. You can also use a sniffer to detect the viewport size and then inline the appropriate CSS and JS on the fly. I have also seen people suggest designing websites for the phone first and expanding out from there. This is the best website I have seen on how all these files interact and what Google is really getting at: https://varvy.com/pagespeed/critical-render-path.html

Here is what I would do:

1. Have a single CSS file for your site and host it on your own server, not an external domain; this is best practice. Take the time to strip out everything you do not use to get the file size down, minify and compress it, and reference it in your header. This may help with render blocking by reducing the number of requested files to just one, but it may not fix the above-the-fold render blocking. If you want to go further with "fixing" that, extract the CSS that is critical to render the above-the-fold items on your site (noting the caveats above), place it inline within your HTML, and put the rest in your single CSS file: https://varvy.com/pagespeed/optimize-css-delivery.html
2. Have a single JS file and host it on your own server. If there is any external JS, see whether you can fold it into that single file. Strip out all the JS you do not use to get the file size down, then minify and compress it. If you want to get past the above-the-fold render blocking, figure out what JS is needed to render the page above the fold, inline that JS within your HTML, set up a single file for all the other JS, and defer loading of that file using this technique: https://varvy.com/pagespeed/defer-loading-javascript.html

I noticed your external JS file from Google AdServices. You may not be able to fold that JS into your main JS file and may have to keep the external reference; I would then try to defer its loading using the technique above. You need to do some testing to make sure this does not throw off how your ads are displayed or tracked. I would also make sure your GA or other web-tracking JS code is inlined as well; otherwise you risk throwing off your web stats.

This is what makes all of this tricky. The Google PageSpeed tool just checks a list of best practices and reports whether they are present or not; it does not measure whether your page actually gets faster, or whether any of these changes break how your site functions. https://developers.google.com/speed/pagespeed/insights/ "PageSpeed Insights analyzes the content of a web page, then generates suggestions to make that page faster."

This is why you need a tool that shows actual page speed and a waterfall chart with timings, so you can see how everything interacts; webpagetest.org is a common one. It gets really complicated, really fast, and this is where a really good front-end developer is worth it. I would start with my simple initial suggestions above and not sweat the above-the-fold stuff. Test your site's actual speed and see how it does; you can also set up GA to give you page-speed data. You can then decide whether you need to take it to the next level. Another thing you can try (I have not been able to get it to work myself) is Google's module that does all the above-the-fold inlining and other speed tricks for you: https://developers.google.com/speed/pagespeed/module/ Just like above, benchmark your performance and see whether it makes a difference on your site. Good luck!
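P.S. To pull the simple suggestions together, a minimal HTML sketch. The file names and CSS rules are hypothetical placeholders, and it uses the standard defer attribute as a simpler stand-in for the onload-injection technique in the linked article:

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Critical above-the-fold CSS inlined so the first paint does not wait
       on a stylesheet download. These rules are placeholders. -->
  <style>
    body { margin: 0; font-family: sans-serif; }
    .hero { min-height: 60vh; background: #f5f5f5; }
  </style>

  <!-- The single minified site-wide stylesheet, fetched without blocking
       render: loaded as a low-priority "print" sheet, then switched to
       "all" once it arrives. -->
  <link rel="stylesheet" href="/css/site.min.css" media="print" onload="this.media='all'">

  <!-- The single minified JS bundle. "defer" downloads it in parallel and
       runs it only after parsing, so it never blocks rendering. -->
  <script src="/js/site.min.js" defer></script>
</head>
<body>
  <div class="hero">Above-the-fold content renders immediately.</div>
</body>
</html>
```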
On-Page / Site Optimization | CleverPhD
Focus Keyword
I agree with Bob, and I want to add that one-word keywords tend to be extremely difficult to rank for. Your best bet is to focus your pages on the key phrases that make the most sense based on the topics of those pages. Have you ever read Cyrus Shepard's "Keywords to Concepts: The Lazy Web Marketer's Guide to Smart Keyword Research"? It's a bit on the older side, but still quite relevant. It might help you wrap your head around some of the intricacies of keyword targeting, and how search engines are learning to understand topics and intent.
Keyword Research | MattRoney
Clear & start over with On-Page Optimizer
In addition to Ryan's awesome information, we are working on adding more flexibility to the on-page tool, which we hope to complete soon. There's no hard ETA, but I expect it will be ready in a few months. Cheers!
Other Questions | DavidLee