Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
What are risks people are seeing with Widget links?
If I were you I wouldn't take the risk; I have had a few clients run into this as an issue in the past. As I don't know your link profile I can't say for sure, but a good link profile should be made up of all different types of links. The risk here is that these links are not earned in Google's eyes, or in some cases the other websites don't know they are giving the link back to you. I always work off the idea that if I feel like there is a risk, it's not worth it. You might get away with it from an algorithm point of view, but it only takes one manual review and it's going to be a lot harder. Wish you the best of luck with it.
| aarongray1 -
Moving to https with a bunch of redirects my programmer can't handle
Have you reached out to the plugin's support team?
| TheSymmetran0 -
Old URLs that have 301s to 404s not being de-indexed.
I don't think 404 vs. 410 is the answer here. The basis for this thought is the following: https://searchenginewatch.com/sew/how-to/2340728/matt-cutts-on-how-google-handles-404-410-status-codes
Quoting Matt Cutts from that article: "If we see a page and we get a 404, we are gonna protect that page for 24 hours in the crawling system, so we sort of wait and we say maybe that was a transient 404, maybe it really wasn't intended to be a page not found." "If we see a 410, then the site crawling system says, OK, we assume the webmaster knows what they're doing because they went off the beaten path to deliberately say this page is gone," he said. "So they immediately convert that 410 to an error, rather than protecting it for 24 hours."
I'm thinking the deeper issue is why the 301s are not being respected. If a link points to http://domain.com/badpage and we use a 301 to point to https://domain.com/badpage, shouldn't the crawler (Google or otherwise) respect the 301? Why still index and serve up a page that responds with the 301? To me, this is baffling. If we serve up a 404 or a 410, either way we are saying "this page is gone", yet we're still seeing the original http://domain.com/badpage in the index. Does that make sense, or is there more clarification required?
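If it helps to verify what crawlers actually receive, here is a minimal sketch using Python's requests library (the http://domain.com/badpage URL is just the placeholder from the question) that walks the redirect chain hop by hop and prints each status code, so you can confirm the 301 is really being returned and what the final target answers with:

```python
import requests

def trace_redirects(url, max_hops=10):
    """Follow a redirect chain manually, printing the status code of each hop."""
    for _ in range(max_hops):
        resp = requests.get(url, allow_redirects=False, timeout=10)
        print(resp.status_code, url)
        location = resp.headers.get("Location")
        if not location:
            break
        # Resolve relative Location headers against the current URL
        url = requests.compat.urljoin(url, location)

trace_redirects("http://domain.com/badpage")  # placeholder URL from the question
```

If the first hop is a clean 301 and the target returns a 404/410, the crawler has everything it needs; at that point it usually comes down to how long Google takes to recrawl and drop the old URLs.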
| boxclever0 -
What is the fastest way to deindex content from Google?
Excellent answer. Thank you very much.
| RosemaryB0 -
Linking from & to in domains and sub-domains
No problem, you don't want to assume things. This is what Matt Cutts said in 2012: https://www.youtube.com/watch?v=_MswMYk05tk It is still advisable that most websites rely on pages and subdirectories for regular pages and content rather than subdomains, but nowadays it is a highly subjective thing; basically suit yourself on it. But keep in mind the simple formula for decision making: pros vs. cons... which outweighs which, and why. Subdomains can certainly cause you, the owner and web dev, more work and impose a higher "moving parts" risk under most typical circumstances where /page is a viable alternative. Also look at these: https://moz.com/community/q/link-juice-from-subdomain https://moz.com/community/q/getting-mixed-signals-regarding-how-google-treats-subdomains
| TheSymmetran0 -
Prioritise a page in Google/why is a well-optimised page not ranking
Hi Laura, Welcome to the Moz community! Without a domain to do some further investigation, these types of questions can be some of the toughest to answer because there are so many variables that could be causing the problem. I can't comment on your current site, but what I can say from experience is that when this happens it's typically one of three things (or a combination):
1. More links pointing to the individual product pages than to the category pages
2. No or low-quality content on the subcategory pages; basically just showing a big chunk of products and nothing else
3. Poor internal linking, with more internal links pointing to those products, the category pages in a separate section of the nav, no category breadcrumbs, etc.
URL structure ties in here in much the same way. If we think about typical user behaviour, it makes perfect sense that your product pages would be getting more backlinks than the categories. If we're talking about real users in a forum and they want to reference a product you sell, they're going to link straight to that product, and the same can be said in most other contexts too. This isn't a problem at all - real backlinks being earned in a natural way is always good. It just means you may need to put some effort into driving strength to the category pages too. Content is a topic that gets so much attention in this industry that I don't think I really need to elaborate here - just make sure there is a decent amount of high-quality content on the pages, styled in a way that doesn't bury the products. UX first! Finally, the internal linking. It's very hard to give advice on this part without seeing the site, but basically, search engines tend to look at your nav to get an indication of which pages are likely the most important on the site, from left to right, top to bottom. If you have an unusual structure where the category pages are buried down in the drop-downs (or - gasp - not even in the nav because there are buttons on the home page or something!) then you're missing an easy win there. Don't forget that breadcrumbs and URL structure should ultimately match this same layout. Something like website.com/category/product. This is all quite vague, but hopefully it makes sense for the site you're working with now. Always happy to take a closer look if you'd like to share the URL here or in PM, but I completely understand if you're not comfortable with that too.
| ChrisAshton0 -
Proxy Servers & SEO
Something like that can be done, but you must exercise care not to get slapped with duplicate content penalties. There can only be one place for that content in the eyes of your users and Google, meaning the proxy server itself should never be visible to the public. But if you do this right, you can have the content rendered on your desired page on your site, and it should get captured by Google properly as well. If you really want to be sure, test it: set up a test page like this without all the fancy dev work, have Google crawl your site, and see what it reports back after it picks up that page. I think you will be OK if you do your research and do not miss your rel canonicals or other countermeasures for duplicate content. However it gets rendered on your /page, make sure Google Search Console can see the same content there.
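As a rough illustration of the pattern (a minimal sketch, not your actual setup: the internal origin hostname, the /page route, and the example.com canonical URL are all made-up placeholders), the idea is to fetch the content server-side from the non-public source and serve it on the one public URL, with a canonical tag reinforcing that URL:

```python
# Minimal sketch: serve proxied content on one public URL with a canonical tag.
# INTERNAL_ORIGIN and the example.com URL are hypothetical placeholders.
import requests
from flask import Flask

app = Flask(__name__)
INTERNAL_ORIGIN = "http://content-proxy.internal"  # never exposed to the public or to Google

@app.route("/page")
def page():
    body = requests.get(f"{INTERNAL_ORIGIN}/page", timeout=10).text
    return (
        "<html><head>"
        '<link rel="canonical" href="https://www.example.com/page">'
        "</head><body>" + body + "</body></html>"
    )
```

The point is that Google only ever crawls the one public URL; the proxied source never resolves publicly, so there is no second copy for it to compete with.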
| TheSymmetran0 -
Ranking drop for "Mobile" devices category in Google webmaster tools
Thanks for the reply. We had been doing well until this drop, and then suddenly dropped for the mobile category. The only changes we made in correlation with this drop were a few redirects and removing footer links. Btw, we are scoring well in mobile speed tests and are mobile-friendly as per Google's tools.
| vtmoz0 -
Ranking Sub Categories on Ecommerce Site
It may end up too deep to count as much as the main subpage would. But in that case, you can create CLEAN URLs for it, meaning you can have the CMS or ecom solution render the data on maindomain.com/standard-metal-lockers instead of maindomain.com/lockers/standard-metal-lockers (I am not 100% sure if this is how it looks though, btw; if it is as simple as this, it is not "TOO" deep, tbh). Then you want to make sure the clean URL is marked as the primary source of data, and the longer CMS-based URL should have a rel canonical associated with it to indicate this. I would even try to de-index that original page from results entirely.
Think like Amazon, the king of ecommerce. Although they have categories, subcategories and even sub-subcategories for products, the URL for an end product or category looks more like this: https://www.amazon.com/Arctic-Silver-AS5-3-5G-Thermal-Paste/dp/B0087X728K/ (notice the keywords follow immediately after .com/, and everything else is configured after them). You see how there are no categories listed before that, and the internal categorization and tracking items are written after the keywords. This is obviously going to be hard to impossible to achieve with a standard CMS unless it is super SEO-friendly somehow, which most CMS (and especially ecom CMS) are not out of the box. So you may have to do some research to see if you can pull it off that way. But the earlier method I described is what I have been able to succeed with previously.
Now let's take another look at what they do. Parent: https://www.amazon.com/Computers-Tablets/b/ref=nav_shopall_basedevices?ie=UTF8&node=13896617011&nocache=1479275082032 If you click on Tablets, you get: https://www.amazon.com/b/ref=lp_13896617011_ln_2?node=1232597011&ie=UTF8&qid=1479275076 (a non-keyword-optimized URL). But if you instead search for "amazon computer tablets", one of the top results is this sub-category URL; see the difference? https://www.amazon.com/Tablets/b?ie=UTF8&node=1232597011 These two are literally one and the same page, but I guarantee you the latter is specifically optimized and indexed for search, while the former is how the CMS sees the links.
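To make the first part concrete, here is a tiny hypothetical helper (the maindomain.com URLs are just the placeholders from above, and clean_url/canonical_tag are made-up names) showing the mapping from the nested CMS path to the clean keyword URL, plus the canonical tag the longer URL would carry:

```python
# Hypothetical sketch: map a nested CMS path to a "clean" keyword URL and
# build the rel canonical that the longer CMS URL should point at it with.
def clean_url(cms_path, domain="https://www.maindomain.com"):
    slug = cms_path.strip("/").split("/")[-1]   # "lockers/standard-metal-lockers" -> "standard-metal-lockers"
    return f"{domain}/{slug}"

def canonical_tag(cms_path):
    return f'<link rel="canonical" href="{clean_url(cms_path)}">'

print(canonical_tag("/lockers/standard-metal-lockers"))
# -> <link rel="canonical" href="https://www.maindomain.com/standard-metal-lockers">
```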
| TheSymmetran0 -
Why some websites can rank the keywords they don't have in the page?
Also, who is the guy? Lol, name them if you suspect wrongdoing or unethical behavior! I know of at least one page-1-ranking site around my locality that ended up converting a student organization domain with a lot of .edu backlinks and then just ran with it for SEO, representing itself in multiple irrelevant large cities across the country. They even blatantly dared to SELL backlinks on their site... in broad daylight! In 2016! Needless to say, I reported them to Google and saw them get kicked down a notch to page 2 for my test keywords from earlier searches.
| TheSymmetran0 -
Organic keyword ranking drops across the board?
There seems to have been significant movement in the SERPs around November 10th in the US (which could have hit the UK a couple of days later, although I have no data to support that). Unfortunately, we don't have much in the way of details or confirmation right now. Are you seeing any specific pattern in the loss? I'm not quite sure I understand what the graphic is showing.
| Dr-Pete0 -
Redirect HTTP to HTTPS
Yes - for the simple reason that from January 2017, Google Chrome will start marking all sites that do not have https active on sensitive pages as insecure - this is a very important step. As is the way with Google, this 'gentle' shepherding is surely a forewarning of a move to 'https is mandatory' before long... https://security.googleblog.com/2016/09/moving-towards-more-secure-web.html "Eventually, we plan to label all HTTP pages as non-secure, and change the HTTP security indicator to the red triangle that we use for broken HTTPS." So really, you have no choice, and honestly, it's not so bad. We just moved our entire site to https without any issues.
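If the question is how to do the redirect itself, here is a minimal sketch (assuming a Python/Flask app purely for illustration; in practice this is usually handled in the web server or load balancer config rather than application code): send a permanent 301 from every http:// URL to its https:// equivalent so link equity follows the move.

```python
# Minimal sketch of an app-level HTTP -> HTTPS redirect (illustrative only;
# most sites do this in Apache/nginx or at the load balancer instead).
from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def force_https():
    if request.scheme == "http":
        https_url = request.url.replace("http://", "https://", 1)
        return redirect(https_url, code=301)  # 301 = permanent, passes link signals

@app.route("/")
def home():
    return "Served over HTTPS"
```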
| alwaysriding0 -
Pagination & View all option on Ecommerce site
Hi, thank you for the reply. Would you consider a "view all" option a good pagination experience, or is there something else I should be looking at? Becky
| BeckyKey0 -
Anchor text optimisation
For internal links, just keep them relevant so people know what the link is before even clicking on it. If you have a guide on how to write page titles, make the internal anchor something like "guide to page titles". As for external backlinks, if they're editorial links that someone else is posting, it's usually best to just leave this up to the site owner, since they're going to make it relevant anyway. The best way to have a natural-looking anchor text profile is to make it natural! If you have to control the anchors (e.g. directories or a guest post) then we typically lean toward this priority order:
1. Your brand name
2. Your URL
3. Your primary keyword(s)
That means that if you did web design and the name was "ProWeb", your anchor text profile would probably look something like this, in order of highest to lowest volume: ProWeb, Pro Web, http://www.proweb.com, www.proweb.com, Web Design, Website Design, webdeign. You will get all kinds of variations like mis-spellings, "read more", "click here" etc. as well, but you get the picture. I hope that helps! Just make sure you don't have your keywords as the most common anchors - it's pretty obvious spam!
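If you ever want to sanity-check that last point against a backlink export, a quick sketch is just to count anchors and see what floats to the top (the anchors list below is made-up data; in practice you'd load it from an Ahrefs/Moz/Search Console export):

```python
# Quick sanity check: count anchor texts from a backlink export and confirm
# exact-match keywords aren't the most common anchors. The list below is fake data.
from collections import Counter

anchors = ["ProWeb", "ProWeb", "www.proweb.com", "web design", "ProWeb", "click here"]

for anchor, count in Counter(a.lower() for a in anchors).most_common(5):
    print(f"{count:>3}  {anchor}")
```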
| ChrisAshton0 -
Silly Question still - Because I am paying high to google adwords is it possible google can't rank me high in organic?
The other comments here are correct; AdWords will not alter your organic rankings. The issue here may be something else entirely. You mentioned that your onsite factors are better than the competition's, and that both your site and your top competitor have few backlinks. It sounds like that should put you on a roughly even playing field (though it depends on link quality too), so perhaps there are some holes in your onsite optimisation that could be addressed. You also mentioned that your engagement metrics like time on site and page views are better than the competition's. Assuming you're getting this info from something like SEMrush, it could simply be that you're relying on inaccurate data to make your decisions. SEMrush is great for basic information, but it isn't 100% accurate. If you're comfortable dropping a link to your website, I'd be happy to take a much closer look for you and offer some helpful feedback. If you don't want that link public, you can PM me instead.
| ChrisAshton0 -
I'm stumped!
Thanks, Bernadette. The consolidation has been done, and surprisingly seems to correlate with our dropping visibility. Do you have any experience with a "consolidation" that might shed some light?
| LindsayDayton0 -
Javascript search results & Pagination for SEO
Hi Jake, thank you for your input. I've looked into what might be blocked and I still have a couple of .js files blocked; they're AJAX, e.g. http://www.key.co.uk/wcsstore/dojo18/dijit/nls/loading.js http://www.key.co.uk/wcsstore/dojo18/dojo/fx/easing.js These aren't in the robots.txt files - do you think it's worth finding & unblocking them? Thank you
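One quick way to double-check whether robots.txt is really what's blocking those files is Python's built-in robots.txt parser (a small sketch; it assumes the robots.txt lives at the site root and that Googlebot is the user agent you care about):

```python
# Check whether robots.txt actually disallows the two .js URLs for Googlebot.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://www.key.co.uk/robots.txt")  # assumes robots.txt at the site root
rp.read()

urls = [
    "http://www.key.co.uk/wcsstore/dojo18/dijit/nls/loading.js",
    "http://www.key.co.uk/wcsstore/dojo18/dojo/fx/easing.js",
]
for url in urls:
    print(rp.can_fetch("Googlebot", url), url)
```

If both print True, robots.txt isn't the blocker, and the blocked-resources report is probably stale or picking up a robots.txt on another host (e.g. a CDN or subdomain).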
| BeckyKey0 -
Duplicate H1 on single page for mobile and desktop
Hey Ryzippy, Google really doesn't want to penalize site owners for making web design decisions without taking their crawler into account, so I don't think there will be a problem. That said, I'd recommend that you upload the new design to a test group first, just in case. This is not a risk-free solution, though, so I'd also recommend that you push back on your web developer. Is there really no way to use the same H1 for both the mobile and desktop design? This is exactly what responsive design is for, to use the same HTML elements, but different CSS based on the screen width. Anyway, good luck, and check back in here to let other SEOs know if using multiple H1s will cause a problem! Best, Kristina
| KristinaKledzik0