Posts made by TomRayner
-
RE: Which is the best tool to analyze server log file?
You can't beat the SEO Log File Analyser from Screaming Frog, imo.
-
RE: Discourage search engines from indexing this site AFTER a site launch
Hey there
I presume you're using WordPress here. In my experience - no, that won't have a long-term detrimental effect on your site's ability to rank once the site goes live.
If you're concerned, however, you could install a "construction" or "coming soon" page, which will allow the homepage to be indexed but prevent other URLs from being found/crawled (so long as you don't submit a sitemap until you're ready).
SeedProd's free plugin comes highly recommended, and I've used it before to good effect: https://en-gb.wordpress.org/plugins/coming-soon/
Hope this helps.
-
RE: Is anyone else's ranking jumping?
This may have something to do with Google's recent change to show results as country-specific by default, rather than based on whichever Google TLD you search from (.co.uk, .de, .fr etc).
This is causing a few rank checkers to throw out some wild results - I've seen all the major ones affected by it.
If you can't recreate the results and traffic is normal, don't worry too much, as the software vendors will be rolling out fixes soon.
FYI - if you want to get round Google's change and still get results from a specific country, you can add:
&gl=us
&gl=uk
&gl=fr
etc. to the end of your query string. Replace the country code with whichever you need.
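For example (the query below is just a placeholder), searching via:
https://www.google.com/search?q=blue+widgets&gl=uk
should return UK-flavoured results, regardless of which Google domain you're on.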
-
RE: Overlay / modal for product pages - bad or good for SEO?
Hey Arnaud
I think it is definitely worth testing for UX and CRO purposes, and I don't think you'll do your pages any "harm" from an SEO POV.
If the overlays appear on click within the category, and the rest of the category page is readable and crawlable, it shouldn't cause any problems.
What's great is that you've already considered how you could rank those individual products themselves by giving them their own URL. Those might struggle a bit though if the URLs are not linked to directly from within the product category silo, or elsewhere on the site.
However, I don't see that as necessarily a bad thing. Unless you have a specific product type that sells very well and has significant search volume itself, I'd wager that most of your inbound organic traffic would best be served by the category pages anyway (i.e. if someone searches for blue widgets, the category shows all the widgets you have, not just one type). That is more likely to match the intent of the people entering your site.
I would just ensure that you nail the tech and onsite aspects of those category pages - and the rest should be fine.
-
RE: Redirecting acquired website: DNS or 301?
I'd go ahead and 301 redirect those websites and pages.
With a 301 redirect you will also pass on any link equity the infringing websites once had, which in turn may help your organic ranking performance.
However, in order for that to happen, you need to ensure that you redirect the individual pages on those websites to the most relevant/equivalent versions on your own. Otherwise, you may see those 301 redirects treated as soft 404 errors.
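As a minimal sketch (assuming the acquired sites run on Apache, and with placeholder URLs), page-level redirects in the old site's .htaccess might look like:

```
# Map each old page to its closest equivalent on the new site (301 = permanent)
Redirect 301 /old-product-page/ https://www.yoursite.com/equivalent-product-page/
Redirect 301 /old-about-page/ https://www.yoursite.com/about/
```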
Hope this helps.
-
RE: Are we confusing Google with our internal linking?
Possibly. Internal links and their anchor text can certainly give Google a steer on what a page is about, and on which landing page you'd prefer to rank.
However, there can be more to it as well. How much does the sub-page 'talk' about the keyword? What is its content like? Do you have any canonical issues? And how about the homepage - how much of its content is about the keyword?
You could be cannibalising efforts by having a number of pages all talking about the same thing. Content is quite often just as 'confusing' to Google as the internal links.
-
RE: Use hreflang for language and regional URLs
Hi there
Looks like when you've been adding the code, you've pasted in special characters or something like that. This is why you're getting the “en-sg” error.
If you look at the source code in Chrome, you can see the code is italicised:
http://i.imgur.com/IfTwvpj.png
What I'd do is retype the code manually, or remove any special formatting from the text before adding it into your HTML. You want to just get the raw quotation mark in there ("), rather than the special character.
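As a rough sketch (the URL here is just a placeholder), the tag should end up looking like this, with plain straight quotes throughout:

```
<link rel="alternate" hreflang="en-sg" href="https://www.example.com/sg/" />
```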
Hope that makes sense
-
RE: Meta Descriptions - Does Cutts' comment still hold true?
I think Matt's point was that he'd prefer to let Google generate meta descriptions - in the hope they'd be unique - rather than using the same one repeatedly.
My view? You should be writing unique meta descriptions for all of your key landing and sales pages.
Not for SEO - far from it. But because it's the shop window that people will see before they reach your site via organic search. You have a great opportunity to make it compelling, to attract the click away from the results around (or above) you, and to get your unique selling points across.
Google can never do a better job at that than you.
-
RE: Does type of hosting affect SEO rankings?
Hi Mark
It certainly can make a difference, for some of the reasons that you've alluded to. If we break hosting's influence on ranking down to its 3 core factors, we have:
- Security
- Location (which ties into:)
- Speed
Security is pretty basic - the more vulnerable your site is to being hacked, the greater the chance of a hack happening and of a drop in rankings following (rankings which can then be very hard to regain). If you're managing the server yourself, make sure you take all the necessary steps. If you're using a managed solution, vet the provider as much as possible.
Location - this is a two-fold factor. There is some correlation (albeit not a big one) that if you had two equal websites, one hosted in the UK and one in the US, and you're trying to rank in Google UK, the site hosted in the UK might rank a bit better. It's not huge, but worth keeping in mind. The bigger factor in server location is where the server sits relative to your typical (or target) users. The closer your users are to your server's datacentre, the faster the server response is likely to be. Faster websites are happy websites. That leads us to the main point:
Speed. As a (very general) rule, the more allocated RAM and bandwidth, plus the greater the server's processing capabilities, the faster it is likely to be. Typically, this means that dedicated servers (where it's you and only you taking 100% of the server's resources) will perform better than a dedicated VPS (100% of the resource, just less of it), a shared VPS, or shared hosting. The bigger your site becomes, and the bigger the bandwidth footprint it creates, the more you'll need a server that can handle it. **There is a good correlation between site speed and organic rankings.** If the server is slowing you down, it could be holding you down as well.
But servers are just the start - there are a number of server and site configurations that can have a big impact on site speed, such as a Content Delivery Network (CDN), Gzip compression and image optimisation. Swing over to GTmetrix, add in a URL, and you'll get a speed analysis plus tips on how you can improve your speed score and overall site speed.
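As a quick, hedged example (assuming an Apache server with mod_deflate available - adjust for your own stack), Gzip compression can be enabled with a few lines in .htaccess:

```
# Compress text-based assets before they're sent to the browser
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```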
Hope this helps.
-
RE: What to include in my report ?
Speaking as both a producer and a recipient of these reports:
The most important things for me are:
- Revenue
- Conversions
- Traffic
In that order.
For any SEO report, I would expect to see, above all else, how much revenue organic visitors (or people who have visited from organic search in some part of their user journey) have generated. Some companies model this entirely within Google Analytics; others may have a CRM that stores the customer financial data you'll need to sync up with analytics. But basically, show them **how much money** organic search made them last month.
After that, I'd want to see how many organic search visitors converted on the site (bought something, signed up etc). It's worth reporting this separately from revenue, as you can get an idea of which conversions are more "valuable" to you. Show them how many conversions organic search has given them.
Next, look at traffic. This is important to report as you need to show them how much organic search is contributing to their overall traffic, and hopefully show them it is increasing. It can also help you identify other issues - for example, if traffic increases but revenue or conversions do not, is there a problem with the quality of the traffic or, more likely, a problem with the website's user journey or design? Remember, organic search is inbound marketing, and so searchers are actively looking for the thing that drives them to your web page. If you can get organic traffic but can't get it to convert, it is usually a problem with your website's design or UX. Show them how many visitors organic search has given them.
Anything else, to a company receiving an update on how SEO is progressing, is just fluff. My clients don't need to see things like error rates, crawl and indexing stats, canonical reporting - even rankings (a sub-KPI of traffic, but ultimately not what the client is interested in). For a top-level report - something that answers "what has our money got us?" - I stick with these three principles. If I'm producing a technical audit on the website, then of course you want to give more detail, but for a general monthly report it's overkill.
There are caveats to all of this (such as breaking down branded vs non-branded organic traffic), but this should be the starting point.
Hope this helps.
-
RE: Should I Build A Niche City Site or Link to Existing Directory?
First things first:
Is the current city page getting any traffic?
If yes - don't change a winning formula.
If not - let's weigh up the options.
By going with a separate domain, you're making the assumption that you'll be able to target the site specifically for the region + product/service and that users might be more likely to convert on a domain that contains the city (which may or may not be true).
By adding to the existing site, you're helping to reinforce the site's brand and you're collecting link equity in one place. If you have a solid on-site SEO structure, that link equity can be efficiently passed around to rank other pages on your site. Potentially, you'll get a stronger link equity with this method than you would with a dedicated city site, as you have several cities and products all with the potential to win links.
The citynamehomesforsale.com domain (unless it has historical links) is unlikely to have any SEO equity itself, meaning you'll be starting from scratch. Is that really worthwhile, or would you be better off building up your existing web property and brand, utilising the strength it may already have, and attempting to rank your city keywords that way?
I'd be more inclined to look at that method, unless the site is under a form of algorithmic penalty.
Hope this helps
-
RE: What is Linking C-Blocks
To lift a quote from Keri Morgret in this thread from a few years ago:
"It refers to the part of the IP address that's different. The same class C address means something has the same third octect in the address. In the following, the first three IPs are in the same class C, and the fourth address is not.
192.168.1.1
192.168.1.2
192.168.1.3
192.168.100.4
...it's a hint to Google that the sites are all related to each other and on the same server, and that the links may not be very natural since there is the good possibility that the same person set them up."
-
RE: Mozscape Index?
Hi Milan
I can't speak for Moz - but I would imagine that because those links were only discovered just prior to the update, they weren't actually incorporated in the update itself. The index may have updated on the 15th, but the cut-off point for URL entries may have been a few days before - possibly before it had discovered your links.
I'd be pretty confident that now Mozscape has seen your new links they'll be fully incorporated in the next update.
Hope this helps.
-
RE: Robots.txt - Googlebot - Allow... what's it for?
Hi Luke
As you have correctly assumed, that particular robots command would be pointless.
Googlebot does follow Allow directives (while some other crawlers do not), but they should only be used as an exception to a Disallow rule.
So, for example, if you had a rule that blocked pages within a sub-directory, with:
Disallow: /example/*
You could create an allow rule that permits a specific page within that directory to be crawled, like:
Allow: /example/page.html
Couple of things to point out here. "At a group-member level, in particular for allow and disallow directives, the most specific rule based on the length of the [path] entry will trump the less specific (shorter) rule." (Google Source). In this example, because the more specific rule is the allow rule, that will prevail. It is also best practice to put your "allow" rules at the top of the robots.txt file.
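Put together, a minimal sketch of that robots.txt (the paths are placeholders) would be:

```
User-agent: *
Allow: /example/page.html
Disallow: /example/*
```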
But in your example, if they have allow rules for JS and CSS files without having disallow rules for those directories/paths - it's a waste of space. Google will attempt to crawl anything it can by default, unless you disallow access.
TL;DR - You don't need to proactively tell Google to crawl CSS and JS - it will by default.
Hope this helps.
-
RE: Seeking guidance setting up hreflang en-gb for international english website and en-us for North American site
Hi there
I'd highly recommend going through Aleyda Solis' international SEO posts here on the Moz blog. They cover how to prepare for international SEO, how to approach site structure, and how to generate the relevant code and hreflang tags.
Here is her international SEO checklist
Here is her Hreflang blog post and generator tool
And 40 tools to help advance your international SEO
They're great reading and nothing that I'd be able to add to, so I hope this helps!
-
RE: Disavow backlinks
Hey
At the very least, if you're removing links to try to get a manual penalty revoked, showing evidence that you have directly contacted webmasters and had some links manually removed is essential.
Google wants to see you repent - to take action against your past actions. I have not seen a single manual penalty revoked just by disavowing links straight away. They want to see documentation of your work, which includes manual outreach.
There is also the possible argument that physically removing a link will be 'quicker' at getting it out of your backlink profile, with it being reported that a disavow file can take between 3 and 9 months to be processed. It wouldn't surprise me if Google were trying to speed this up dramatically - but we know that Google crawls, caches and indexes links relatively quickly, and by the same notion it would also process removed/broken links quickly. So there could be an advantage there too.
Hope this helps.
-
RE: Best practice for URL - Language/country
Hi Peter
Both are viable options.
I'd highly recommend going through Aleyda Solis' international SEO posts here on the Moz blog. They cover how to prepare for international SEO, how to approach site structure, and how to generate the relevant code and hreflang tags.
Here is her international SEO checklist
Here is her Hreflang blog post and generator tool
And 40 tools to help advance your international SEO
They're great reading and nothing that I'd be able to add to, so I hope this helps!
-
RE: Linkbuilding
Hi Sarmad
On the whole, the health niche is very competitive. You may have found a nice little pocket to target, but the likelihood is that you'll be going up against some big brands and operations, so whoever you choose to work with should be obliged to give you a frank estimate - which may involve a bigger budget than expected.
Be wary of those PMing you promising the world at a low cost. If it sounds too good to be true - it probably is.
-
RE: How should I deal with "duplicate" content in an Equipment Database?
"Ideally, we wouldn't want to exclude these pages from being indexed because they could have some long-tail search value. But, obviously, we don't want to hurt the overall SEO of the site."
You say that, but I'm not entirely sure it's true.
I understand the theory - if you have 20 Citroen C1s listed on the site, you could potentially have 20 pages of yours ranking for relevant terms, right?
Well, unique content on those pages or not, I think it would be extremely unlikely that Google would want to present all of those results to the user. Furthermore, if the pages expire or go "out of stock", as it were, when purchased, would Google want to rank them?
So I'm not convinced having all those pages indexed and treated as unique (whether they are or not) would result in traffic (please prove me wrong though - if you have lots of entrances to the site via organic search to those pages it'll show what I know!).
My preference, regardless of the above, would be to have a main page for your Citroen C1 products - a hub page - that then links to all the different products you have as and when they're available.
This has many advantages - you just need to focus on ranking one page in the category instead of several, you can collect all the link equity you earn to one page, you can ensure the page is well optimised for search engines and users, and the page will be evergreen - meaning your links would be too.
The short version:
Homepage > Hub Page > Product variant 1, variant 2 etc
Rank the homepage and the hub page.
Hope this helps.
-
RE: Adjustable Bounce Rate
Picking up on Dirk saying:
"I prefer to know if people scroll to the end of the page (so I assume they have read the article) rather than just put an arbitrary time to fire an event."
This was shared the other day - it's a way of pulling scroll-depth data into your Google Analytics reports. Incredibly useful:
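In the same vein, here's a minimal sketch of the idea (assuming Universal Analytics / analytics.js is already on the page; the 75% threshold and event names are just examples):

```
// Fire a GA event the first time the visitor scrolls past 75% of the page
var scrollEventSent = false;
window.addEventListener('scroll', function () {
  var scrolled = window.pageYOffset + window.innerHeight;
  var total = document.documentElement.scrollHeight;
  if (!scrollEventSent && scrolled / total >= 0.75) {
    scrollEventSent = true;
    // nonInteraction stops the event from affecting bounce rate on its own
    ga('send', 'event', 'Scroll Depth', '75%', { nonInteraction: true });
  }
});
```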