Based on your response, and given that you only have 50 pages you actively care about, it's likely not the right tool for you. You're probably better off with a tool that is far more focused on on-page optimization for the keywords that are important to your business. I would only start worrying about crawl behavior on a site that operates at scale, with 10,000+ pages that change regularly.
Posts made by Martijn_Scheijbeler
-
RE: Thoughts on Botify?
-
RE: Should I noindex pages on my website that are pulled from an API integration
Hi,
I don't see any big problems with an API integration like this. There are a lot of companies that use data from other content providers (through an API), and in most cases the real question is how you can enrich that content. That's why I would leave the pages indexed and work on enriching them with as much additional (more unique) content as possible.
Martijn.
-
RE: Thoughts on Botify?
I've used Botify at my previous job (Postmates) and I also brought it on at my current job at RVshare. I think it's a great tool that brings a lot of new things to the table that GA/GSC can't (and won't) offer. If you're interested in learning how changes to your site directly influence crawl behavior by bots, and how that can help you drive additional traffic, then it's a great plus. On the technical/product side it will open up a lot of new opportunities for testing and continuous optimization.
As for whether it's useful for you: I would be hesitant to buy an expensive tool (you're likely north of $1,500 monthly) for a smaller site, as there are fewer ways to optimize at that scale. But if those 5K pages generate millions of users from Organic Search on a monthly basis and bring in a lot of money, then yes, it might be worth it. If you're comparing it to Moz/SEMrush etc.: don't. They're not comparable and have totally different price points. You use Botify in addition to your current toolset, not as a replacement for it.
-
RE: How does changing sitemaps affect SEO
Hi Jason,
I wouldn't worry about changing this at all; in the end, the 50K limit that has been put on sitemaps is an arbitrary one, so if you keep your sitemaps well under that it doesn't really change anything. The files themselves are not a ranking factor; they're used to make search engines aware of URLs they might not otherwise discover through links on the site, and to notify them of URLs that have been updated (through the lastmod attribute). So changing it to 15K shouldn't harm you.
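For reference, a minimal sitemap entry with the lastmod attribute (the URL and date are just examples) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/some-page/</loc>
    <lastmod>2019-06-01</lastmod>
  </url>
</urlset>
```

Whether you put 15K or 50K of these entries in one file, the format stays the same; search engines only care that each file stays under the limit.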
Martijn.
-
RE: XML Sitemap Question!
Hey,
Yes, you can safely do that. In the end, you mostly want to make sure that the right pages are being crawled and indexed. If that requires certain (XML) files to be noindexed, then that's the way you want to go.
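If you're on Apache, one way to noindex an XML file is an X-Robots-Tag response header. This is just a sketch: it assumes an .htaccess setup with mod_headers enabled, and the filename is only an example:

```apache
<Files "sitemap.xml">
  Header set X-Robots-Tag "noindex"
</Files>
```

The header keeps the file out of the index while still letting it be crawled, which is what you want for sitemap files.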
Martijn.
-
RE: Should i switch from .com to .fr / .de etc
Hi,
The first thing I would look at in any case is whether you're able to use hreflang. If you aren't, then making any of these changes won't help you much, as the backlinks in that case would be spread across the different TLDs. Usually that doesn't work in your favor.
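For reference, hreflang annotations in the head of each page look something like this (the domains and language codes are just examples):

```html
<link rel="alternate" hreflang="en" href="https://www.example.com/" />
<link rel="alternate" hreflang="fr" href="https://www.example.fr/" />
<link rel="alternate" hreflang="de" href="https://www.example.de/" />
```

Every language version needs to list all the alternates, including itself, for the annotations to be valid.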
In general, I would advise a company to stick with the solution it already has before considering a move to another TLD/folder structure. In this case, I'm having a hard time seeing the upside.
Martijn.
-
RE: Tools for finding quality backlinks
Hi John,
I can make this a very lengthy comment, but the answer to both is: Yes and Yes :). Let me know if you have any specific questions.
Martijn.
-
RE: Hide Cross Domain Rel=Canonical
Hi,
What kind of cross-domain canonical is there at the moment? You could sort of cloak the canonical if you really want to, but I'd want to understand what's currently going on before suggesting you go that route.
Martijn.
-
RE: Html extensions
Hi Julie,
I can confirm what Gaston says: there is no additional value in changing the extension (or adding one) on files. In the end, it's not something Google pays any attention to, nor does it add any value. That's why I would rather focus my energy on any of the dozens of other factors that do play a role in ranking higher.
Martijn.
-
RE: Content update on 24hr schedule
1. No, not really. It mostly depends on the percentage of content that isn't yours and can be viewed somewhere else. If reviews make up 90% of the page and they originally come from another site, that won't work in your favor. But in this case, I'm assuming you're working around that.
2. No.
3. I would say No.
4. It depends, as long as you're not creating duplicate content at scale you should be fine.
-
RE: Should I add my html sitemap to Robots?
No, it won't help you at all, as an HTML sitemap isn't a valid format for robots.txt to reference. What you can do is link to the HTML sitemap from multiple pages on your site, so you provide an efficient way for Google to access it and use it to crawl the other pages on your site.
-
RE: Fetch as Google temporarily lifting a penalty?
Ok, that still doesn't mean that they're not personalized. But I'll skip on that part for now.
In the end, the changes that you're seeing aren't triggered by what you're doing with Fetch as Google. I'll leave it to others to see if they can shine a light on the situation.
-
RE: Fetch as Google temporarily lifting a penalty?
Hi,
I'm afraid I have to end this dream for you: there is no connection whatsoever between the rankings and the Fetch as Google feature within Google Search Console. What's likely happening is that you're already getting personalized results: within a certain timeframe the ads won't be shown and the results will look different, as Google thinks you've already seen the first results on the page the first time you Googled this.
Fetch as Google doesn't provide any signal to the regular ranking engines to say: "Hey, we've fetched something new and now it's going to make an impact on this". Definitely not at the speed that you're describing (within seconds).
Martijn.
-
RE: Using 410 To Remove URLs Starting With Same Word
Hi,
Have you also excluded these pages from the robots.txt file so you can make sure that they're also not being crawled?
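A robots.txt sketch for that exclusion (assuming all of these URLs start with /mono, as in your example) would be:

```
User-agent: *
Disallow: /mono
```

Note that robots.txt rules are prefix matches, so this blocks every URL whose path starts with /mono.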
The rule to serve the 410 looks something like this (in an .htaccess context, where the pattern is matched without the leading slash; the [G] flag returns 410 Gone rather than a redirect):
RewriteEngine on
RewriteRule ^mono - [G,NC]
Martijn.
-
RE: Is using REACT SEO friendly?
Hi Martin,
It can be; that's the honest answer. React uses JavaScript to load its pages and, in most cases, its content. Google and other search engines are able to render that content, but in these cases you always need to check what the actual result is. I've worked with many sites using React, and it depends on whether they're using server-side or client-side rendering. Start there, to figure out what you can use for your client/company. Some teams are really drawn to client-side rendering, which is a bit more dangerous, as Google can't always see the actual content. With server-side rendering, I've seen it go well for most sites.
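A simplified illustration of the difference (the HTML snippets below are made up, not from any real site): with client-side rendering, the raw HTML a crawler first fetches often contains only an empty root element, while a server-side-rendered page ships the content directly.

```python
# Hypothetical raw HTML as a crawler would fetch it, before any JavaScript runs.
CSR_HTML = '<html><body><div id="root"></div><script src="/bundle.js"></script></body></html>'
SSR_HTML = '<html><body><div id="root"><h1>RV rentals in Texas</h1></div></body></html>'

def content_visible_without_js(html: str, expected_text: str) -> bool:
    """Check whether the expected content is present in the raw HTML,
    i.e. visible without executing any JavaScript."""
    return expected_text in html

print(content_visible_without_js(CSR_HTML, "RV rentals in Texas"))  # False: content only appears after JS runs
print(content_visible_without_js(SSR_HTML, "RV rentals in Texas"))  # True: content is in the initial HTML
```

Doing this kind of "view source" check on your key templates is the quickest way to see which rendering mode you're actually dealing with.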
Let me know if you have any specific questions, happy to answer them!
Martijn.
-
RE: How much keyword difficulty score i accept when website is new?
There's no fixed number; it depends. Your website is new, but that doesn't mean it can't rank for any keyword at the moment. I wouldn't go after the most popular, and therefore likely most competitive, keywords. But it's good to keep in mind that you're still able to go after some of the potential low-hanging fruit in your industry.
In the end, any number you pick is going to be subjective, so you need to make sure you evaluate the opportunity the right way.
-
RE: 406 Errors from Third-Parties websites In Google Webmaster Tools
You don't; I would ignore it, as it's not something that could actively hurt your site. It's not great that it's happening, and I would double-check with the sources whether their content can still be seen by Google's user agents, but besides that it's likely fine.
-
RE: Https problem on google result.
Hi,
The page seems to work fine with SSL for me at the moment in Google Chrome. I checked with some other tools and they also recognize the SSL certificate. Make sure that the result for this page isn't cached in Google Chrome; this sometimes happens.
Martijn.
-
RE: Remove Product & Category from URLS in Wordpress
Hi,
You should be able to just change this in the permalinks. It's good to be reminded of some of the things you need to take care of: redirects, and making sure that all old links point to the new URLs. But most of all, I want to make sure you're aware of how this changes your internal URL structure and depth. That's probably why most people would advise you not to go this way.
Martijn.