Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
SEO - Use pages on main site or set up outside keyword rich domains and websites
Hi Ricky, Rand did a great Whiteboard Friday on 'How to Optimize for Competitors' Branded Keywords' in February this year that I think you'll find helpful. Both options you suggested could be successful. I think it really comes down to two things: how much your client is willing to spend, and whether they're comfortable with their company site openly targeting competitor terms. If they have no issue with openly targeting competitor brand names, I'd say do it on the current site; this will also be the cheaper option. If they want it to appear as an anonymous product comparison, you will obviously have to build a separate site, which will require more budget. You don't need to worry about a Google penalty with either of these options. Cheers, David
| davebuts0 -
301ing Pages & Moving Content To Many Other Domains
This is a great metaphor: "Finally, will moving all of this content damage Site A? Yes. This is cutting out body parts similar to arms and organs. When this content leaves, the traffic flow into Site A will drop. The number of linking domains and pages will drop. The offer of this content to entertain existing visitors will be gone. The size of that loss will determine the impact. Rankings of remaining content might fall if the loss is great. If arms and legs or heart or brain are extracted, then expect Site A to suffer. But if lesser things are lost, the damage will be lower - but some damage will happen. Search engines and visitors will all notice. Enthusiastic visitors will find the content in its new home and they might move with it." Will definitely be using this for future explanations!
| Joe.Robison1 -
How can I optimize internal linking by increasing the number of internal links to chosen landing pages and decreasing them for less important pages within the site?
Hey Anirban - internal linking is definitely a good tactic for the right sized site. There's a pretty good overview of internal linking over on this Quicksprout video, one highlight: "But, to find out how to get the most bang for your buck, you want to find the pages that have the most authority. Those are the ones that you want to link from to the pages you’re trying to rank. Let me show you what I mean. We’re at Open Site Explorer. What I did is, I put in the home page of my site and pressed the search button. When you’re here, it’ll automatically rank inbound links coming from other sites, but that’s not what we want. We want pages from within our site. So, click on the “Top Pages” tab. When you do that, you’ll see the page authority of all the pages on your site. Starting from highest to lowest, these are the pages that you want to build internal links on, pointing to the pages that you actually want to rank." Brian notes that you can automate some of it if you have thousands of pages, but you shouldn't rely on it too much since there's not enough anchor text diversity: "I’m not a huge fan of doing the automated, because if you do it like this, you’re not going to have a lot of anchor text diversity which may get you dings for over-optimization. If you have a site with thousands of pages, you will need to maybe do some automation and at least have some of your internal links done by a plugin or some piece of software." 
This Moz Academy lesson also has some good tips on internal linking:
- Make links relevant and useful
- Don't stuff keywords into anchor text too much, and don't overuse footer links
- For huge sites, you can use sitemaps well and include them in the footer (see 5:00 into the video)
I recently wrote up a summary of ecommerce internal linking best practices, but the ideas apply to non-ecomm sites too:
- Link product pages together via the description if they're related
- Tell a story in blog posts about your products and link to those product pages
- Link blog post content to relevant category pages (underutilized)
- Make category pages more user-friendly with content, and link liberally to relevant internal pages
- Use user-generated content/curation pages to help you naturally build internal links at scale
Hopefully these ideas point you in the right direction - let me know if you have any followup or more specific questions! If you haven't used Screaming Frog's SEO Spider to scan your site and map out internal linking, that can be another great tool to help you out.
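If you have crawl data (e.g. a Screaming Frog export), the "find your strongest pages and link from them" idea can be roughed out in a few lines. A minimal Python sketch, assuming a hypothetical list of (source, target) internal link pairs - inlink counts are only a crude proxy for page authority, not a replacement for a real metric:

```python
from collections import Counter

def rank_by_inlinks(links):
    """Given (source, target) internal link pairs, count inlinks per page.

    Pages with the most internal inlinks are rough proxies for the site's
    strongest pages - good candidates to link *from* toward the pages
    you actually want to rank.
    """
    counts = Counter(target for _, target in links)
    return counts.most_common()

# Hypothetical crawl data - in practice, export this from a crawler
links = [
    ("/", "/guides/"),
    ("/", "/blog/post-1/"),
    ("/blog/post-1/", "/guides/"),
    ("/blog/post-2/", "/guides/"),
]
print(rank_by_inlinks(links))
```

Sort descending, then add contextual links from the top of that list to your target landing pages.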
| Joe.Robison0 -
Website dropped out from Google index
"P.P.S. Just noticed that there is noindex x-robots-tag in headers" That will do it. You are telling Google to take all of your pages out of Google. You set that at the web server level and so you will need to get into your apache or nginx setup https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag Get on this ASAP!
| CleverPhD0 -
Directory Structuring - I'm so confused what to do...
I wouldn't go beyond 2 levels deep on any sub-directory except on rare occasions when it's still a short URL. The directory structure does not matter so much for SEO if done correctly. However, with WordPress, the breadcrumbs, navigation and site-wide links will end up with poorly prioritized interlinking. This could be bad if you have important pages in sub-directory structures, as you do. For example, your 1st-level directory pages such as /guides/, /treatments/ and /social/ are cannibalizing page authority simply because they are linked to more (and in better positions) than your Men's Guide page. I recommend beefing up the internal links to the Men's Guide page and improving and lengthening the content on that page, as you mentioned you plan to do. Instead of linking to the page with the anchor "Introduction", use "Men's Hair Loss Guide." It's a good idea to keep URLs as short as possible so that your keywords have the opportunity to appear in bold in the URL of a search listing; this makes the listing more attractive. You may also want to consider placing the Men's and Women's Guides at the top level of your menu instead of the sub-menu if possible. I've also noticed you have a lot of old content being indexed with a different website design, such as the About Us page. I'm not sure if this is intentional, but I would suggest migrating it to the new design or redirecting where appropriate. I hope these suggestions are of some use. Feel free to get in touch if you would like to discuss this or more advanced SEO further.
| Chris_Hickman1 -
Print pages returning 404s
It sounds like you may be using the WP-Print plugin. This issue has been resolved by others by regenerating the permalink structure. Go to WP-Admin -> Settings -> Permalinks -> Save Changes Let me know if that works for you. Best of luck.
| Chris_Hickman0 -
AMP pages for a responsive Ecommerce website?
From my reads and attending Google hangouts with their AMP engineers, I take it that Google intended for AMP pages to be used for eCommerce from the inception of the project. You may find it helpful to read Using AMP to Reach Mobile Buyers and Getting Started with AMP for E-commerce.
| jessential0 -
Low-quality links on non-www version only?
Hi Jason, Yes, you should remove them. Assuming you have non-www to www redirects in place, those links are still resolving and therefore counting against your site.
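For reference, a minimal sketch of a non-www to www 301 in nginx - the domain is illustrative, and an Apache setup would use a RewriteRule instead:

```nginx
# Illustrative: 301 all non-www traffic to the www host
server {
    listen 80;
    server_name example.com;
    return 301 $scheme://www.example.com$request_uri;
}
```

With this in place, any link pointing at the bare domain resolves (via a single 301) to the www version, consolidating the link equity - good or bad - onto one host.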
| LoganRay0 -
No designated 404 page, but any made-up URL path displays the homepage - good or bad?
URLs that don't exist should display an error page and return a 404 (not found) response. When you have made-up URLs going to an actual page that returns a 200 (OK) response, the problem actually is that Google will start seeing this as soft 404 errors, which is not good. From Google Search Console: "Returning a code other than 404 or 410 for a non-existent page (or redirecting users to another page, such as the homepage, instead of returning a 404) can be problematic. Firstly, it tells search engines that there’s a real page at that URL. As a result, that URL may be crawled and its content indexed. Because of the time Googlebot spends on non-existent pages, your unique URLs may not be discovered as quickly or visited as frequently and your site’s crawl coverage may be impacted..."
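To illustrate the difference, here is a minimal Python sketch (paths are hypothetical) of a handler that returns a real 404 for unknown URLs instead of serving homepage content with a 200 - the behavior Google flags as a soft 404:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical set of real pages on the site
KNOWN_PATHS = {"/", "/about/", "/contact/"}

def status_for(path):
    """Return 200 for real pages, 404 otherwise.

    Serving homepage content with a 200 for made-up URLs is what
    Google treats as a soft 404; a genuine 404 status avoids that.
    """
    return 200 if path in KNOWN_PATHS else 404

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        code = status_for(self.path)
        self.send_response(code)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        body = "<h1>Page not found</h1>" if code == 404 else "<h1>OK</h1>"
        self.wfile.write(body.encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```

The error page itself can still be helpful (search box, links to popular pages) - what matters is that the HTTP status code is 404, not 200.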
| Linda-Vassily0 -
Many New Urls at once
Thanks Egol, I guess our best bet on showing unavailable products would be to focus on related products. Cheers,
| viatrading10 -
External Keyword Anchor Links - Always Bad?
Keyword rich anchor text on external sites will help your search rankings. But it's also against Google's rules. If an unnaturally high percentage of the links to your site have the same anchor text, you run the risk that Google will ignore their value or will penalize your site. My suggestion would be to get some links with keyword rich anchor text but also get a lot of other links that don't have keyword rich anchor text so your overall link profile looks natural.
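One quick way to sanity-check your own profile is to measure how concentrated your anchor text is. A minimal Python sketch with hypothetical anchors - a real audit would pull these from a backlink export, and any "risky" threshold is a judgment call, not a published Google number:

```python
from collections import Counter

def anchor_share(anchors):
    """Return the most common anchor text and its share of all backlinks."""
    counts = Counter(anchors)
    top_anchor, top_count = counts.most_common(1)[0]
    return top_anchor, top_count / len(anchors)

# Hypothetical backlink anchors; note the mix of branded, URL and generic anchors
anchors = ["blue widgets", "blue widgets", "Acme Co", "https://acme.example",
           "click here", "blue widgets", "Acme", "acme.example"]
anchor, share = anchor_share(anchors)
print(f"{anchor!r} accounts for {share:.0%} of anchors")
```

If one exact-match keyword dominates the distribution, that is the "unnaturally high percentage" pattern described above; a natural profile skews toward branded, URL and generic anchors.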
| Kurt_Steinbrueck1 -
How can I fix an SSL setup that is redirecting 3 times?
I don't think so. I would let it be.
| HiveDigitalInc0 -
How to solve our duplicate content issue? (Possible Session ID problem)
Hi, This is a pretty simple fix - all you need to do is add a canonical tag that points to the root version of the URL. For example, on https://www.mrnutcase.com/en-GB/designer/?CaseID=1128510&CollageID=21&ProductValue=3364, add this canonical tag: <link rel="canonical" href="https://www.mrnutcase.com/en-GB/designer/" />
| LoganRay0 -
Directory links with no follow
Hi, thanks for this. I haven't used a scraper before - I'll have a look at it. Thanks!
| BeckyKey0 -
Broken images & Rankings
Great, thanks - it hasn't affected everything, just a few things. I'll give it a go and get back to you. Thanks!
| BeckyKey0 -
Structured data in Google Webmaster Tools
Hey Beau, Thanks for your reply. We never received anything from Google in Google Webmaster Tools. One day the site was gone from the SERPs with no explanation. That was on 11 August. Since then we have done a lot of things in Google Webmaster Tools to see if that would help. There were no major issues, but we have done everything we can and now there is nothing left to do. Yesterday, the site was back on track in Google. I can't say why, but positive thinking (the ability to look for solutions rather than somebody to blame) is, I believe, one of the most important reasons why we are back. We had not deliberately used anything to mark up our site with structured data before. The only thing we could find that might have helped us with structured data was a plugin, Author hReview, which I find very good. However, my intention now is to mark up the entire site with structured data. I read the recommended article about structured data (thanks, it was really interesting). I'm no expert in it, nor is my web developer, hence I have been thinking of contacting a Schema.org expert. What do you think - would that be a good idea? The more I read about Schema.org, the more it triggers my interest and the more I realize that it could be good to get some help. In Google Webmaster Tools we still have 4 errors regarding structured data; however, when we test it in their own tool everything works. In the control panel we have added the http, https, www http, and www https versions. For some reason Google shows different results for indexed pages, structured data, and sitemaps depending on which of these versions you're looking at. The structured data is now back and we have managed to increase it. Thanks a lot for your help, Beau. Your questions and answers helped us start looking more closely at structured data. It was a great relief yesterday when we were back in the search results again. I hope it will stay that way.
Do you think it's worth spending time and resources to contact a Schema.org consultant who can help us mark up the entire site with structured data? Have a nice day! Anders
| Enigma1230 -
Disavow files and net com org etc ....
Keep it as precise as possible. Whether you disavow the whole domain or not is your choice - there's no problem doing that if you need to. However, if you are sure there is literally only one link on the site, it is probably advisable to disavow only the specific URL. To the best of my knowledge there is no problem with disavowing a full domain, but as ThompsonPaul has said, if you disavow the whole domain it will affect the entire domain including subdomains, so treat it with respect. The other thing is to be sure the link is doing you harm before you remove it - I have seen even so-called spam links knock a site down a few points once disavowed.
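For reference, the two forms look like this in a disavow file - the domain here is purely illustrative:

```text
# Hypothetical disavow file - the domain is illustrative
# Option 1: disavow just the one URL you identified
http://spammy-directory.example/links/page-42.html
# Option 2: disavow the entire domain (also covers its subdomains)
domain:spammy-directory.example
```

Lines starting with `#` are treated as comments, and the file is a plain UTF-8 text file uploaded through the disavow tool in Search Console.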
| seoman100 -
Canonical URL Multidomain Geolocation Based
Hey, It depends on a few factors here:
1. What is the desired geographic target for each of these? I am assuming the .au and .co.uk are for Oz and the UK, so geo-target them (auto) in Webmaster Tools and sign those off.
2. The .com is generic, so it could be competing - ensure you geo-target this to the US if that is the desired geography.
This ensures they are not competing with each other and there will be no duplication issues, but that's based on the US, Australia and UK assumption. Hope that helps, Marcus
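Alongside geo-targeting in Webmaster Tools, hreflang annotations are a common way to spell out which version serves which audience. A hedged sketch with illustrative domains - the full set of tags goes in the head of every version:

```html
<!-- Illustrative hreflang tags; each site lists all three alternates -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/" />
```

This tells Google the pages are regional alternates of each other rather than duplicates competing for the same queries.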
| Marcus_Miller0 -
Disavow Experts: Here's one for ya ....
No problem at all, happy to help. Unfortunately the best tools that we have to evaluate these are tools like Open Site Explorer, which try to emulate how Google looks at links, but they're imperfect for the very same reason that I can't possibly give you a definitive answer: Google doesn't want us to know! The only way we can ever know the outcome is to implement the change and see if the rankings get better or worse - welcome to the struggles of SEO!
If you really can't afford to be taking a hit right now but it would be more acceptable in a month or two (e.g. right now is your busiest period), I'd be inclined to wait. Otherwise, it's a tough call, but I'd still lean toward having them removed. Don't forget that Google has been promising a Penguin (backlinks) update "very soon" all year! If that damn update finally rolls out tomorrow you may find yourself getting slammed by it... or it could roll out next year... or maybe it'll roll out and you'll be fine. Sigh.
We have had success in doing it steadily with one of our larger clients who were in a similar situation, and the results were as good as we could have hoped for, but YMMV. We essentially did the removal in stages: we divided the bad domains up into batches, then contacted the first batch requesting removal, then disavowed. While all this was happening we also got to work building quality links to the site, so the losses and gains roughly cancelled each other out. Then we did the same thing with the other batches of bad links until we'd been through the lot.
For us, the end result was a series of fairly marginal peaks and troughs that directly correlated with link removal and link acquisition, so the net position at any given time was approximately the same. I must stress though that YMMV here - since I have a total data sample of 2 domains (this client has 2 companies/sites), it's impossible for me to say with absolute certainty that what I saw is the direct result of our process.
| ChrisAshton0