Are those main category pages (like /collections/living-room-furniture) or are they different?
Posts made by Tom-Anthony
-
RE: How to deal with filter pages - Shopify
-
RE: Google Images search traffic and image thumbnail issues
Hi,
I noticed you have your own CDN-like domain at dq1.me, which has multiple sub-domains. On some of the articles I looked at, images on the page were being pulled in from various sub-domains of dq1.me. Furthermore, some of your tags, such as og:image, were using the same images but on the primary www.careeraddict.com domain. I'd suggest you try to sort this confusion out, and have a canonical image URL that is used everywhere across the site. I imagine there are many confusing signals for Google with there being 2-4 URLs per image, all potentially being used in different places.
Good luck!
-Tom
-
RE: Sitemap issue
seoelevated is correct.
Reference example from Google here: https://support.google.com/webmasters/answer/189077?hl=en
-
RE: Trailing slash redirects not working in https - working in http: What might be the culprit?
Hi,
What happens when using HTTP? Are you able to share the .htaccess code?
The same .htaccess setup should work for both protocols, but with .htaccess the devil can be in the details sometimes. A good tool for testing it is here: https://htaccess.madewithlove.be/
Do other redirects from HTTP to HTTPS work ok? I'm wondering whether your Apache setup perhaps has a different virtual host setup for HTTP and thus a different .htaccess file.
Best,
Tom
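For illustration, here is a minimal sketch of a trailing-slash rule that behaves identically on HTTP and HTTPS. This assumes mod_rewrite is enabled and that the same .htaccess is actually served for both protocols (the domain is made up); `%{REQUEST_SCHEME}` needs Apache 2.4+.

```apache
RewriteEngine On

# Skip real files (e.g. /style.css) so only "directory-like" URLs match.
RewriteCond %{REQUEST_FILENAME} !-f
# Redirect anything not already ending in a slash, preserving whichever
# scheme (http or https) the request came in on.
RewriteRule ^(.*[^/])$ %{REQUEST_SCHEME}://%{HTTP_HOST}/$1/ [R=301,L]
```

If this works over HTTP but not HTTPS, that points back at the virtual host setup rather than the rules themselves.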
-
RE: Are Expires Headers Detrimental to SEO Health?
Hi Dana,
Expires headers and other caching headers can help improve site performance (as you said), and that will be a good thing for SEO. There is no reason to be concerned - they are common headers and there isn't much they could do to have any negative impact on SEO.
Good luck!
Tom
-
RE: Robots.txt in subfolders and hreflang issues
Hi there!
Ok, it is difficult to know all the ins and outs without looking at the site, but the immediate issue is that your robots.txt setup is incorrect. robots.txt files should be one per subdomain, and cannot exist inside sub-folders:
A **robots.txt** file is a file at the root of your site that indicates those parts of your site you don't want accessed by search engine crawlers.
From Google's page here: https://support.google.com/webmasters/answer/6062608?hl=en
You shouldn't be blocking Google from either site, and attempting to do so may be the problem with why your hreflang directives are not being detected. You should move to having a single robots.txt file located at https://www.clientname.com/robots.txt, with a link to a single sitemap index file. That sitemap index file should then link to each of your two UK & US sitemap files.
You should ensure you have hreflang directives for every page. Hopefully after these changes you will see things start to get better. Good luck!
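To make that concrete, here is a sketch of the setup described above (the filenames are hypothetical; clientname.com stands in for the real domain). The single robots.txt at the root blocks nothing and points at one sitemap index:

```
# https://www.clientname.com/robots.txt
User-agent: *
Disallow:

Sitemap: https://www.clientname.com/sitemap-index.xml
```

The sitemap index file then references the two regional sitemaps:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.clientname.com/sitemap-uk.xml</loc></sitemap>
  <sitemap><loc>https://www.clientname.com/sitemap-us.xml</loc></sitemap>
</sitemapindex>
```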
-
RE: 6 .htaccess Rewrites: Remove index.html, Remove .html, Force non-www, Force Trailing Slash
Hey NeatIT!
I see you have a working solution there. Did you have a specific question about the setup?
I did notice that your setup can sometimes result in chained 301 redirects, which is one area for possible improvement.
Let me know how we can help!
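As an illustration of the chaining issue: with separate rules, a request to a hypothetical /about.html can hop twice (first the .html is stripped, then a second redirect adds the trailing slash). A sketch of collapsing those hops into single 301s:

```apache
RewriteEngine On

# /index.html (or /foo/index.html) -> the directory URL, in one hop.
RewriteRule ^(.*/)?index\.html$ /$1 [R=301,L]

# /about.html -> /about/ directly: removes .html AND adds the trailing
# slash in a single 301, instead of two chained redirects.
RewriteRule ^(.+)\.html$ /$1/ [R=301,L]
```

The same idea applies to the www rule: have it redirect straight to the final canonical URL rather than to an intermediate form that triggers another rule.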

-
RE: How do we build rank on a domain with an offsite blog?
If you configure the reverse proxy (and redirect traffic going direct to the true hosting [x domain] to the sub-folder on the [y domain]) then it will absolutely be counted as part of the main domain as if it was really just in the sub-folder.
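As a rough sketch of what that looks like in an Apache virtual host (domain names here are made up, and mod_proxy plus mod_proxy_http are assumed):

```apache
# On www.ydomain.com: serve the blog hosted on xdomain.com
# as if it lived in a sub-folder of the main domain.
ProxyPass        /blog/ https://blog.xdomain.com/
ProxyPassReverse /blog/ https://blog.xdomain.com/
```

The redirect for visitors hitting the true host directly would live on the x domain's server (e.g. a 301 from its root to https://www.ydomain.com/blog/). In practice the proxy is usually pointed at an internal hostname on the x domain so that this redirect doesn't catch the proxy's own requests and loop.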

-
RE: Htaccess redirects
If I have understood it correctly, you want:
- To redirect certain category pages to new category pages on a new domain.
- Redirect all other pages to the homepage on a new domain
- Not to redirect the homepage and for that to stay on the current domain.
If so I think this should probably do what you want:
RewriteEngine On
RewriteRule ^$ - [L]
RewriteRule categoryA/ http://www.newdomain.com/newcategoryA/ [R=301,L]
RewriteRule (.+) http://www.newdomain.com/ [R=301,L]

The first rule says "if this is the homepage, stop evaluating rules". The second maps a category page over to the new version and stops evaluating rules; you'll need to duplicate this one for each of the specific categories. The last rule is your catch-all for redirecting everything else to the homepage of the new domain.
If you want to easily test/check then I highly recommend this .htaccess checker tool.
Good luck!

-
RE: What can I do to rank higher than low-quality low-content sites?
If you have many URLs from the old site in the index that are all in the same directory (or a handful of directories) you can quickly and easily remove whole directories of URLs from the index via Google Search Console. We have found it to work very quickly.
-
Go into Search Console, select 'Remove URLs' under 'Google Index' in the left-hand menu.
-
Add the page or folder you want to remove, and click next. If you add the homepage, that's the same as all pages on the site. If you add a folder you'll get three options under the ‘Reason’ drop down.
One of those options is ‘Remove directory’. Select that.

-
-
RE: Cloudflare - Should I be concerned about false positives and bad neighbourhood IP problems
Hi,
-
I have used CloudFlare for a few sites and never had an issue with this. It is a risk/concern with all shared hosting, but CloudFlare are very proactive about addressing anything impacting their customers, so I would not have a concern on this side of things at all.
-
Again, I wouldn't have concerns here. CloudFlare are very adept at handling large-scale DDoS attacks. Having read some of their post-attack analysis reports, they usually mitigate any impact to customers very quickly. They have loads of customers, and if this sort of thing were an issue I think we'd hear about it fairly often.
-
I can't speak to the % of users that might get falsely identified as a risk and presented a CAPTCHA, but I'd be very surprised if it was as high as 1-2%; I've rarely seen that CAPTCHA screen myself. You should check what CloudFlare have to say on this issue, but I would have no concern here either.
I have never had an issue with CloudFlare impacting SEO performance or impacting the user experience. It has generally performed well for me, but the biggest issue I see with it is people hoping it is a 'cure all' and means they don't need to properly address issues affecting the performance of their site. If your database performance is very poor, meaning dynamic pages take a long time to load, then CloudFlare is not the answer (it may help - but you should address the underlying issue).
I am unsure about the issue with CloudFlare failing when your server is slow - I'd imagine CloudFlare support could help you with this - there may be a configuration option somewhere.
Overall - my suggestion would be that you go for it.

-
-
RE: Lazy Loading of products on an E-Commerce Website - Options Needed
Ok, cool. To reiterate - with escaped_fragment you are just serving the same content in a tweaked format and Google recommend it rather than frown upon it. Good to be sure though.
See you at SearchLove!

-
RE: Lazy Loading of products on an E-Commerce Website - Options Needed
Hi,
I am not sure I follow your concerns around serving an alternative version of the page to search engines - is the worry that it will be frowned upon, or is it a technical concern?
Using the escaped_fragment methodology would work for your purposes, and would be the best approach. If you have technical concerns around creating the HTML snapshots you could look at a service such as https://prerender.io/ which helps manage this process.
If that doesn't answer your question, please give more information so we can understand more specifically where your concerns are.

-
RE: Why isn't our new site being indexed?
Hey Dirk,
No worries - I visited the question for the first time today and considered it unanswered, as the site is perfectly accessible from California. I like to confirm what Search Console says, as that is 'straight from the horse's mouth'.
Thanks for confirming that the IP redirect has changed, that is interesting. It is impossible for us to know when that happened - I would have expected things to get indexed quite fast once it changed.
With the extra info I'm happy to mark this as answered, but would be good to hear from the OP.
Best,
-Tom
-
RE: Why isn't our new site being indexed?
I am in California right now, and can access the website just fine, which is why I didn't mark the question as answered - I don't think we have enough info yet. I think the 'fetch as googlebot' will help us resolve that.
You are correct that if there is no robots.txt then Google assumes the site is open, but my concern is that the developers on the team say that there IS a robots.txt file there and it has some contents. I have, on at least two occasions, come across a team that was serving a robots.txt that was only accessible to search bots (once they were doing that 'for security'; another time because they misunderstood how it worked). That is why I suggested checking Search Console to see what shows up for robots.txt.
-
RE: Why isn't our new site being indexed?
I'd be concerned about the 404ing robots.txt file.
You should check in Search Console:
-
What does Search Console show in the robots.txt section?
-
What happens if you fetch a page that is not indexed (e.g. https://www.woofadvisor.com/travel-tips.php) with the 'Fetch as Googlebot' tool?
I checked and do not see any obvious indicators of why the pages are not being indexed - we need more info.
-
-
RE: "Null" appearing as top keyword in "Content Keywords" under Google index in Google Search Console
It seems like the issue is a bug in the way Google handle data from your site ('null' being computer speak for 'empty', and often appearing after buggy handling of data). However, it seems that the indication from Umar is correct, and that this buggy data handling is likely prompted by a crawling issue, so that is the best place to start.
-
RE: Best way to get the fastest WordPress site with existing template
This is a hard question to answer without knowing the reasons why your site is possibly under performing on page speed tests. There could be a variety of reasons.
I'd recommend a caching plugin, and that your images are all appropriately sized and then, as Stramark suggests, you could look at a CDN to help alleviate the problem.
A new theme/template may help, but is unlikely to be any sort of silver bullet.
-
RE: What would cause the wrong category page to come up?
Is it possible for you to give a clearer description of the categories? You say they are different products, but that one is a second category of the other.
Does the page you want to rank show up for any other searches? In your analytics are you getting any traffic from Google to that page?
-
RE: Can i get a list of internal and external links from moz?
Hi,
So it sounds like you have ~20 internal links per page, which is not unreasonable and shouldn't be a point of concern.

However, if you are still worried, then I'd worry less about finding a specific list of links to dig into - at this scale that likely won't be very helpful or actionable.
I'd recommend that you review the "Top Pages" report in OSE to see which pages have a large bulk of links. Then I'd do a review of the internal site architecture, looking at the major elements (top and side navigation, footer links, any dynamic menu systems) to see whether there are places to tighten that up a little. You can use what you learnt from the 'Top Pages' report to help inform this review.
However, as I said - I'd certainly not feel you need to go overboard unless you have additional reasons to think that this might be a problem.
Good luck!