Questions
How to carry across/capture link juice during an SEO site migration
This is exactly the right answer! Also remember that unless the content at the redirect origin and destination URLs is similar, Google may decide not to transfer the SEO authority across. So if you looked at the last active iteration of the old URL which 'earned' the SEO authority (and links), compared its content to the new URL's (via something like a string similarity tool), and the read-out wasn't good, you might lose some (or all) of your SEO authority. The best thing to do is actually get the backlinks amended, if you possibly can, as this circumvents the whole problem. That being said, link amends can be time-consuming, and what you actually get back can be iffy (some webmasters can even get annoyed if not approached correctly).
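If you want to sanity-check that similarity yourself, here's a minimal Python sketch, assuming both pages are still fetchable (e.g. the old URL via an archive copy); the URLs and the crude tag-stripping are illustrative only:

```python
import difflib
import re
from urllib.request import urlopen

def page_text(url: str) -> str:
    """Fetch a URL and crudely strip HTML tags to get comparable text."""
    html = urlopen(url).read().decode("utf-8", errors="ignore")
    return re.sub(r"<[^>]+>", " ", html)

def similarity(old_url: str, new_url: str) -> float:
    """Rough 0-1 similarity between two pages' visible text."""
    return difflib.SequenceMatcher(None, page_text(old_url), page_text(new_url)).ratio()

# Hypothetical example: compare the old page against its redirect target
print(f"Content similarity: {similarity('https://example.com/old-page', 'https://example.com/new-page'):.2f}")
```

A low score is a warning sign that the redirect may not carry the full authority across.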
Intermediate & Advanced SEO | effectdigital
Backlink audit - anyone know of a good tool for manually checking backlinks?
Hi Luke,

Moz Pro has a good backlinks tool: https://moz.com/products/pro

Ahrefs also has a pretty substantial backlink checker (https://ahrefs.com/backlink-checker), as does Majestic (https://majestic.com).

For free tools, you can use SEO PowerSuite, which you can run locally on your PC: https://www.uk.link-assistant.com/features-and-editions.html - I would make sure you check the settings on this, though, and read the user guide, as it is a pretty powerful suite of tools.

The HOTH also lets you check your backlinks for free (this is powered by Ahrefs, I think): https://www.thehoth.com/backlinks-checker/

I hope this helps.
Intermediate & Advanced SEO | MrWhippy
Site structure / IA out of balance? What does that mean to SEO?
Hi Steve and thanks for the feedback - it would definitely be interesting to check - I can't imagine this is a huge issue on uncomplicated sites without thousands of pages, but who knows... testing is needed. All the best, Luke
Intermediate & Advanced SEO | McTaggart
Should a menu work when JS is disabled? Is that best practice?
Thanks Dmitrii - it always seems to make things simpler if devs stick to my guidance on JS: ensuring all links (inc. menus) and all static text are visible whether JS is switched on or off. Sadly, they often miss this bit!
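As a minimal sketch of what that guidance looks like in practice (the markup and IDs here are hypothetical): the menu is plain HTML anchors that crawlers and no-JS visitors can follow, and JavaScript only enhances it.

```html
<!-- Plain-HTML menu: every link works and is crawlable with JS disabled -->
<nav id="main-nav">
  <ul>
    <li><a href="/products/">Products</a></li>
    <li><a href="/about/">About</a></li>
    <li><a href="/contact/">Contact</a></li>
  </ul>
</nav>
<script>
  // Progressive enhancement only: JS adds a hook for mobile toggling,
  // but the links above never depend on it running.
  document.getElementById('main-nav').classList.add('js-enhanced');
</script>
```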
Web Design | McTaggart
Website structure - best tools to analyse and plan, visually
Screaming Frog also has a basic, useful site visualisation capability built into it.
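If you'd rather build your own visualisation, here's a rough Python sketch using networkx, assuming you've exported internal links as a CSV of source,target URL pairs (the filename and columns are hypothetical):

```python
import csv

import matplotlib.pyplot as plt
import networkx as nx

# Build a directed graph from a crawl export of internal links
graph = nx.DiGraph()
with open("crawl_links.csv", newline="") as f:
    for source, target in csv.reader(f):
        graph.add_edge(source, target)

# Size nodes by how many internal links point at each URL
sizes = [50 + 20 * graph.in_degree(url) for url in graph.nodes]
nx.draw_spring(graph, node_size=sizes, with_labels=False, arrows=False)
plt.savefig("site_structure.png", dpi=150)
```

Orphaned or over-deep sections tend to jump out visually in this kind of plot.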
Intermediate & Advanced SEO | ThompsonPaul
Disallow: /sr/ and Disallow: /si/ - robots.txt
Thanks Tomas and Mike - good advice - I have done that and found legacy stuff they've since moved away from; there is indeed no current use for the directives. I wonder whether there's any resource on the web that lists all robots.txt directives and interprets them - if not, then perhaps it would be an idea for Moz?
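For simple prefix directives like these, Python's standard-library robots.txt parser can confirm what they actually block (the example URLs are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Parse the directives in question directly
# (alternatively: rp.set_url("https://example.com/robots.txt"); rp.read())
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /sr/",
    "Disallow: /si/",
])

for url in ("https://example.com/sr/page", "https://example.com/products/"):
    print(url, "->", "allowed" if rp.can_fetch("*", url) else "blocked")
```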
Web Design | McTaggart
What should my main sitemap URL be?
Yes, there's no problem with that. In fact, you can point (or redirect) it to a sitemap index that lists your sitemaps; you can run a test to confirm it works.
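For reference, a minimal sitemap index per the sitemaps.org protocol (the filenames are hypothetical) that a main sitemap URL could serve or redirect to:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-pages.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-posts.xml</loc>
  </sitemap>
</sitemapindex>
```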
Intermediate & Advanced SEO | Roman-Delcarmen
Should you automatically resolve URLs with extra trailing slashes added by accident?
Well, yeah, sure... but why not fix it at the source in the first place? Too many redirects are not a good idea.
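If you do resolve them automatically, the normalisation itself is trivial; here's a minimal Python sketch (the function name is hypothetical) that you'd pair with a single 301 to the cleaned path:

```python
import re

def collapse_trailing_slashes(path: str) -> str:
    """Reduce an accidental run of trailing slashes to a single one."""
    return re.sub(r"/{2,}$", "/", path)

assert collapse_trailing_slashes("/pricing///") == "/pricing/"
assert collapse_trailing_slashes("/pricing/") == "/pricing/"  # already clean
```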
Intermediate & Advanced SEO | andy.bigbangthemes
Underscores, capitals, non-ASCII characters in image URLs - does it matter?
It's not best practice for sure, and it doesn't help with SEO (ideally, you want clean, clear, descriptive URLs and paths). That said, if Google can index the content of the page (or the image), it's not a dealbreaker. You can check by searching for the URL/image path in Google Images.
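If you're renaming files anyway, here's a minimal Python sketch of the kind of clean-up implied above (the function name and example are hypothetical):

```python
import re
import unicodedata

def clean_image_filename(name: str) -> str:
    """Return a lowercase, hyphenated, ASCII-only filename (extension kept)."""
    stem, dot, ext = name.rpartition(".")
    # Transliterate accented characters to plain ASCII where possible
    stem = unicodedata.normalize("NFKD", stem).encode("ascii", "ignore").decode()
    stem = re.sub(r"[^a-z0-9]+", "-", stem.lower()).strip("-")
    return f"{stem}{dot}{ext.lower()}"

print(clean_image_filename("Product_Photo_Über 2.JPG"))  # product-photo-uber-2.jpg
```

Remember to 301 the old image URLs to the new ones if the images already have equity.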
Intermediate & Advanced SEO | randfish
Changing title tags - any potential issues?
We agree with David above: if you are improving the title tags and keeping them all unique and original compared with the other pages on your website, a "Title Tag Cleanup" (going through and optimising them) should be a benefit. Some tips on title tags:

- Place the keyword you want to rank for as close to the beginning as possible.
- Make it short enough that Google doesn't cut it off with "..." (we use https://www.portent.com/serp-preview-tool to preview).
- Don't duplicate title tags across pages; try to make each one unique.
- Instead of repeating a keyword in a title, try using an "LSI keyword" (for example, if you are attempting to rank for "marketing", use "marketing" once and then, if desired, use a keyword like "advertising" or "media").

Hope this helps Luke and best of success!
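A minimal Python sketch of that kind of cleanup audit, flagging duplicates and overlong titles from a crawl export (the sample data and the 60-character cutoff are hypothetical; Google actually truncates by pixel width):

```python
from collections import Counter

# (url, title) pairs, e.g. exported from a site crawl
pages = [
    ("/",       "Acme Widgets | Buy Widgets Online"),
    ("/blue/",  "Acme Widgets | Buy Widgets Online"),
    ("/green/", "Green Widgets with a Really Quite Unnecessarily Long Title | Acme"),
]

title_counts = Counter(title for _, title in pages)
for url, title in pages:
    if title_counts[title] > 1:
        print(f"DUPLICATE: {url} -> {title!r}")
    if len(title) > 60:  # rough character proxy for SERP truncation
        print(f"TOO LONG ({len(title)} chars): {url}")
```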
Intermediate & Advanced SEO | LureCreative
HTTPS - implementation question
Hi Luke, No, there is no logical reason not to use 301 redirects; in fact, I would say skipping them is a terrible mistake that will affect the site very badly. If we don't use 301 redirects, the site will lose rankings, traffic and link juice. Hope this helps! Thanks
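For context, a minimal nginx sketch of that site-wide 301 from HTTP to HTTPS (the domain is hypothetical):

```nginx
# Permanently redirect all HTTP traffic to the HTTPS origin
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://example.com$request_uri;
}
```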
Intermediate & Advanced SEO | Alick300
How old is 404 data from Google Search Console?
Hi Luke, It's a long time, unfortunately. Most of the 404 errors that I usually see in our Google Search Console properties are ones that have been in there for ages. As you're dealing with bigger sites (1 million+ pages), this is usually something you can't easily get rid of. For now I mostly tend to ignore them and focus on the crawl errors that come up during crawls via Screaming Frog or DeepCrawl. Martijn.
Intermediate & Advanced SEO | Martijn_Scheijbeler
Robots.txt wildcards - the devs had a disagreement - which is correct?
Thanks Logan - much appreciated, as ever - that really helps. If I was to add another * to Allow: /?resultspage=, making it Allow: /?*resultspage=, what would happen then?
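For anyone wanting to test this themselves, here's a rough Python sketch of Google-style robots.txt pattern matching (the sample URLs are hypothetical). The short answer it illustrates: /?resultspage= only matches URLs beginning with that literal prefix, while /?*resultspage= also matches when other parameters come before resultspage:

```python
import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    # Google-style matching: '*' matches any character sequence,
    # a trailing '$' anchors the end, everything else is a literal prefix.
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return re.compile("^" + regex + ("$" if anchored else ""))

for rule in ("/?resultspage=", "/?*resultspage="):
    rx = robots_pattern_to_regex(rule)
    for url in ("/?resultspage=2", "/?sort=asc&resultspage=2"):
        print(f"{rule:20} {url:28} {'match' if rx.match(url) else 'no match'}")
```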
Intermediate & Advanced SEO | McTaggart
MozPerks
Thanks for your feedback, Kristina - I ended up rediscovering addshoppers in this instance, which I think I first discovered via Moz quite a while back.
Feature Requests | McTaggart
Is the robots meta tag more reliable than robots.txt at preventing indexing by Google?
Hi there, Regarding the X-Robots-Tag: we have had a couple of sites that were disallowed in robots.txt nevertheless have their PDF, Doc, etc. files get indexed. I understand the reasoning for this. I would like to remove the disallow in robots.txt and use the X-Robots-Tag to noindex all pages as well as the PDF, Doc files, etc. This is for an nginx configuration. Does anyone know what the written X-Robots-Tag would look like in this case?
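Not the original poster's config, but a minimal nginx sketch of what that could look like (the extensions and scope are illustrative; note the robots.txt disallow has to be lifted first so Google can actually fetch the files and see the header):

```nginx
server {
    # ... existing server config ...

    # noindex every response site-wide, HTML pages included
    add_header X-Robots-Tag "noindex" always;

    # Or scope it to document files only; note that add_header directives
    # inside a location block replace inherited ones rather than add to them
    location ~* \.(pdf|docx?|xlsx?)$ {
        add_header X-Robots-Tag "noindex, nofollow" always;
    }
}
```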
Intermediate & Advanced SEO | Bobbi_Tschumper
Product search URLs with parameters and pagination issues - how should I deal with them?
Hi Zack, Have you configured your parameters in Search Console? Looks like you've got your prev/next tags nailed down, so there's not much else you need to do. It's evident to search engines that these types of dupes are not spammy in nature, so you're not running a risk of getting dinged.
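For reference, the prev/next markup mentioned above sits in the head of each paginated page and looks like this (the URLs are hypothetical):

```html
<!-- On page 2 of a paginated product listing -->
<link rel="prev" href="https://example.com/widgets?page=1">
<link rel="next" href="https://example.com/widgets?page=3">
```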
Intermediate & Advanced SEO | LoganRay