Questions
Directory Structuring - I'm so confused what to do...
I wouldn't go beyond 2 levels deep on any sub-directory except on rare occasions when it's still a short URL. The directory structure does not matter so much for SEO if done correctly. However, with WordPress, the breadcrumbs, navigation and site-wide links will end up with poorly prioritized interlinking. This could be bad if you have important pages in sub-directory structures such as yours. For example, your 1st-level directory pages such as /guides/, /treatments/ and /social/ are cannibalizing page authority simply because they are linked to more (and in better positions) than your Men's Guide page is. I recommend beefing up the internal links to the Men's Guide page and improving and lengthening the content on that page, as you mentioned you plan to do. Instead of linking to the page with the anchor "Introduction", use "Men's Hair Loss Guide."

It's a good idea to keep URLs as short as possible so that your keywords have the opportunity to appear in bold in the URL of a search listing, which makes the listing more attractive. You may also want to consider placing the Men's and Women's Guides at the top level of your menu instead of in a sub-menu, if possible.

I've also noticed you have a lot of old content being indexed with a different website design, such as the About Us page. I'm not sure if this is intentional, but I would suggest migrating it to the new design or redirecting where appropriate. I hope these suggestions are of some use. Feel free to get in touch if you would like to discuss this or more advanced SEO further.
Intermediate & Advanced SEO | Chris_Hickman1
External Keyword Anchor Links - Always Bad?
Keyword rich anchor text on external sites will help your search rankings. But it's also against Google's rules. If an unnaturally high percentage of the links to your site have the same anchor text, you run the risk that Google will ignore their value or will penalize your site. My suggestion would be to get some links with keyword rich anchor text but also get a lot of other links that don't have keyword rich anchor text so your overall link profile looks natural.
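The "unnaturally high percentage" point above can be made concrete by tallying what share of your backlinks reuse the same anchor text. This is a toy sketch: the anchor list and the 30% threshold are illustrative assumptions, not figures from the answer or from Google.

```python
# Count anchor-text usage across a backlink profile and check whether
# one anchor dominates. Data and threshold are made up for illustration.
from collections import Counter

anchors = [
    "cheap blue widgets", "cheap blue widgets", "cheap blue widgets",
    "Example Inc", "https://example.com", "click here", "example.com",
]

counts = Counter(anchors)
top_anchor, top_count = counts.most_common(1)[0]
share = top_count / len(anchors)

# Treat a profile as suspicious if one anchor exceeds an arbitrary 30%.
looks_unnatural = share > 0.30
```

Here `"cheap blue widgets"` accounts for 3 of 7 links (about 43%), so this profile would be flagged, while a mix dominated by branded and bare-URL anchors would not.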
Intermediate & Advanced SEO | Kurt_Steinbrueck1
Disavow Experts: Here's one for ya ....
No problem at all, happy to help. Unfortunately the best tools that we have to evaluate these are tools like Open Site Explorer, which try to emulate how Google looks at links, but they're imperfect for the very same reason that I can't possibly give you a definitive answer: Google doesn't want us to know! Ultimately, the only way we can ever know the outcome is to implement the change and see if the rankings get better or worse - welcome to the struggles of SEO!

If you really can't afford to be taking a hit right now but it would be more acceptable in a month or two (e.g. right now is your busiest period), I'd be inclined to wait. Otherwise, it's a tough call but I'd still lean toward having them removed. Don't forget that Google has been promising a Penguin (backlinks) update "very soon" all year! If that damn update finally rolls out tomorrow you may find yourself getting slammed by it... or it could roll out next year... or maybe it'll roll out and you'll be fine. Sigh.

We have had success doing this steadily with one of our larger clients who were in a similar situation, and the results were as good as we could have hoped for, but YMMV. We essentially did the removal in stages. We divided the bad domains up into batches, then contacted the first batch requesting removal, then disavowed. While all this was happening we also got to work building quality links to the site, so the two roughly cancelled each other out. Then we did the same thing with the other batches of bad links until we'd been through the lot.

For us, the end result was a series of fairly marginal peaks and troughs that directly correlated with link removal and link acquisition, so the net position at any given time was approximately the same. I must stress though that YMMV here - since I have a total data sample of 2 domains (this client has 2 companies/sites), it's impossible for me to say with absolute certainty that what I saw is the direct result of our process.
Intermediate & Advanced SEO | ChrisAshton0
Disavow files and net com org etc ....
Keep it as precise as possible. Whether you disavow the whole domain or not is your choice, and there's no problem doing that if you need to. However, if you are sure there is literally only one link on the site, it is probably advisable to disavow only the specific URL. To the best of my knowledge there is no problem with disavowing a full domain. As ThompsonPaul has said, if you disavow the whole domain it will affect the entire domain including subdomains, so treat it with respect. The other thing is to be sure the link is doing you harm before you remove it; I have seen even so-called spam links knock a site down a few points when disavowed.
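For reference, Google's disavow file is a plain UTF-8 text file with one entry per line: a bare URL disavows that single page, while a `domain:` prefix disavows the whole domain, subdomains included. The entries below are placeholders, not real sites.

```text
# Comment lines start with a hash.
# Disavow one specific URL only:
http://spam.example.com/bad-directory/page.html
# Disavow an entire domain, subdomains included:
domain:spammydirectory.example
```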
Intermediate & Advanced SEO | seoman100
Your Opinion: Thin Content? Should we Retire this section?
Sounds great! Let me know what impact you see from the changes, curious to hear!
Behavior & Demographics | Joe.Robison0
Pingbacks and Trackbacks: A good source for Disavow links?
Hi, It might be a little counter-intuitive to enable pingbacks and trackbacks to find the spammy sites only to have to disavow them, when disabling this function means that they are unable to do a pingback or trackback in the first place. I personally wouldn't go looking for spammy backlinks unless you feel they are causing you a problem. Keep an eye on your Search Console for the links that are appearing and assess each one to see if you think it should be disavowed. -Andy
Intermediate & Advanced SEO | Andy.Drinkwater0
Suggestions on Link Auditing a 70,000 URL list?
Hi! I wrote this guide a few years ago on penalty recovery which may help you, as it contains a lot of methods around auditing the links: https://moz.com/blog/ultimate-guide-to-google-penalty-removal

If we were to approach a project with 70k URLs, we'd do the following steps:

1. Pull all the URLs into a spreadsheet.
2. Split the URLs into domains.
3. Filter the URLs and search for common spammy words, e.g. 'Link', 'Best', 'Free', 'Cheap', 'Dir', 'SEO' etc. (mark as spam accordingly).
4. Run contact finding across all URLs using a tool such as URL Profiler with Whois lookups.
5. Filter by contact name and find duplicates (mark as spam accordingly).
6. Filter by website type and mark as spam accordingly.
7. Manually check the remaining links.

By working through by domain, you'll rule out thousands of spammy links very quickly, though 70k will ultimately take a few solid days of work.

Hope this helps, Lewis
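Steps 1-3 of the workflow above can be sketched in a few lines: group the URL list by domain, then flag domains containing common spammy tokens. The URL list and token set below are illustrative assumptions; a real audit would still need the manual review described in the later steps.

```python
# Group a backlink URL list by domain and flag domains whose name
# contains common spammy tokens (step 1-3 of the audit workflow).
from collections import defaultdict
from urllib.parse import urlparse

SPAM_TOKENS = {"link", "best", "free", "cheap", "dir", "seo"}

def audit(urls):
    """Return (urls grouped by domain, domain -> spam flag)."""
    by_domain = defaultdict(list)
    for url in urls:
        by_domain[urlparse(url).netloc.lower()].append(url)

    flagged = {}
    for domain in by_domain:
        # Split the domain name into tokens on dots and hyphens.
        tokens = set(domain.replace(".", "-").split("-"))
        flagged[domain] = bool(tokens & SPAM_TOKENS)
    return by_domain, flagged

urls = [
    "http://best-seo-dir.example/page1",
    "http://best-seo-dir.example/page2",
    "http://a-normal-blog.example/post",
]
groups, spam = audit(urls)
```

Working per domain like this is what makes the approach scale: one spammy domain can account for hundreds of URLs, so each flagged domain rules out its whole batch at once.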
Intermediate & Advanced SEO | PinpointDesigns0
Canonicals Passing Link Juice?
A Canonical is kind of like a Bots-Only 301 redirect. So, from a purely mechanical perspective, using a canonical can pass link equity to your other page without redirecting Users off of the forum thread. Now, this would be a deceptive use of the Rel=canonical tag and the bots would stop respecting it on those pages. Since a canonical is a suggestion, not a directive, if the bots think that your canonical is improper, deceptive, incorrect, etc. then they can just stop following it. Ultimately, using a canonical tag in the manner you're thinking wouldn't work out the way you would want it to. You might be able to pass equity from the one page to the other for a time... but that would not be a proper or best practices use of the tag and it would not have long term effects. You'd be better served by looking at updating/expanding your content, internal linking, and backlink profile. And take a look at the article that Andy linked to in his response.
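For reference, the rel=canonical tag described above is a single element in the `<head>` of the page; the URLs below are placeholders.

```html
<!-- Placed in the <head> of the forum-thread page. It suggests (does not
     force) the preferred URL as the canonical version to crawlers. -->
<link rel="canonical" href="https://www.example.com/preferred-page/" />
```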
Intermediate & Advanced SEO | MikeRoberts1
Our Journey back to Good Rankings.
Update: In addition to the following happening to our /shop/ section (the bread and butter of the site):

1) Stupidly moving it to shop.domain.com for 2 months, redirecting everything there, then deciding to move it back to domain.com/shop/ ...
2) Developer failing to enable canonicals, resulting in the new shop install having 4+ duplicate pages for every product for about 5 months.

I have now found that the default setting in the Magento store software is a 302 redirect for the 'Auto-redirect base URL' option. Our base URL changed from HTTP to HTTPS. This means that probably for about the last 9 months, our store home page has been 302'd (no link juice passing, and way too long to use a temporary flag like this). This 302 is Magento's default option, and my developer failed to point out the devastating effect it could have on rankings if we didn't change it to "301". Not sure if this has played a role in our lost rankings, as our store is just a sub-section of our site, and I have no idea how I am going to fix this and tell Google "Wait! Here's a 301 instead! Please restore our juice!"
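Within Magento the fix is switching the 'Auto-redirect base URL' option from its 302 default to 301. As a sketch of the equivalent behavior at the web-server level, assuming Apache with mod_rewrite (an assumption; the rule is not from the post), a permanent HTTP-to-HTTPS redirect looks like:

```apache
# Issue a 301 (permanent) redirect from HTTP to HTTPS,
# rather than the temporary 302 Magento sends by default.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

A 301 tells Google the move is permanent, so link equity is consolidated onto the HTTPS URL instead of being held back as it is with a 302.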
Search Engine Trends | HLTalk0