Hi,
Are you using any filters in Analytics, for example to exclude bots, queries, or pages, anything like that?
Well, you can't rely 100% on third-party tools anyway; they miss things sometimes, especially Moz, which is still a rookie at this. The other day I got an email from Moz saying we couldn't crawl one of your sites, a red-alert "Warning" that the robots.txt wasn't there. I double-checked it manually and everything was fine, then I had Moz re-crawl the site to confirm everything was okay, and it was.
Moz needs to work hard to improve its tool and its functionality; it doesn't give you a professional look and feel.
Hi,
The Moz tool can be flaky at times. What you can do is manually check whether the h1 tags are there on your page, using Inspect Element or View Source. Also try Google's own tools such as Lighthouse, PageSpeed Insights, and the Mobile-Friendly Test; I'm not sure these will give you the exact answer, but you will see most of the issues. Try seositecheckup.com too, and you can also fetch the page with rendered HTML to see how Google actually reads it.
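For reference, when you view the source you are simply looking for a heading element like the one below somewhere in the body (the text is just a placeholder):

<body>
  <!-- The heading tag the crawler is checking for; the wording doesn't matter for this check -->
  <h1>Main heading of the page</h1>
  <!-- rest of the page -->
</body>

If the h1 is there in View Source but a tool still flags it as missing, the heading is probably injected by JavaScript, which is exactly what the rendered-HTML fetch will show you.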
Agreed with the comment above. DA is not a Google metric, nor has Google endorsed it in any way. SSL is about serving a secure page, and I agree it might play a bigger role in the future. Getting a link from a secure versus a non-secure site doesn't really make a huge difference; what makes the difference is that the site you are getting a link from needs to be a quality site in terms of UI, UX, content, technical aspects, and of course its linking strategy. A poor site can be secure, and you still won't get any benefit out of it.
Hi,
I have been in a similar position: I dealt with 10+ websites with the same issues, now fixed. The nature of the business was the same and the services were the same; only the markets were different, e.g. .in, .co.uk, .ca, .us. We were initially unsure whether to set up a blog on all the local sites, because that raised questions of duplication, conflicts, etc.
So what I did: I went with separate blogs for a few of the top-performing sites, say .co.uk and .us; the services were the same and so were the keywords. Since we were using hreflang, ccTLDs, and International Targeting, Google had enough signals to understand which page belongs to which country. We started rephrasing the master content, taking more of an LSI approach, and published it under .co.uk, since that was our second-best-performing market (the first was the US). This way we didn't confuse Google or our users, yet we were writing quality content and rephrasing it with an LSI approach to publish on the other sites.
Initially Google was showing the US content in the UK and the UK content in the US, but that got fixed once we had enough content on each site.
As for moving to a subdomain just to centralize everything: no, I wouldn't suggest that. What you should do is time-consuming but long-lasting: produce the content, rephrase it with the LSI approach, expand or trim it based on each country's requirements (or whatever you feel is needed), and then tag the content/URLs with self-referential hreflang and canonical tags, as in the sketch below.
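As a minimal sketch (using the placeholder domains from this thread, and assuming the same page exists on both the US and UK sites), the head of the US page would carry something like this:

<!-- On https://www.xyz.us/ : the canonical points to the page itself -->
<link rel="canonical" href="https://www.xyz.us/" />
<!-- The hreflang set lists the page itself AND every alternate -->
<link rel="alternate" href="https://www.xyz.us/" hreflang="en-us" />
<link rel="alternate" href="https://www.xyz.co.uk/" hreflang="en-gb" />

The UK page mirrors this, with its canonical pointing to https://www.xyz.co.uk/ and the same two hreflang lines. Those reciprocal return tags are what make the annotation valid; a missing return tag is a common reason Google ignores hreflang.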
Hi,
I just got a client who had the same problem you have: fluctuating numbers, a drop, and then back up again. I did some analysis, as he had also bought some SEO services. I wouldn't blame the backlinks he had as such, but there was an excessive number of backlinks built during Dec 2017 - Mar 2018, which I believe weren't needed at all. I did some cleanup on the backlinks and now he is back up again.
When it comes to your case: first, do not rely 100% on third-party tools like SEMrush; they can be right up to a point, but not 100%. Second, seasonality or competition could be another thing that pushed you down during a certain period.
Third, what does GSC say, high impressions or clicks? It's okay to have high impressions, and if clicks are roughly in line with your organic traffic, that's fine too, because GSC and GA never report exactly the same data; the numbers can differ by 5%-10% (e.g., 1,000 clicks in GSC might show up as roughly 900-950 organic sessions in GA). Also compare high impressions against CTR: look at what was working in February, June, or any other strong month versus what isn't working this month, and which queries dropped in CTR or impressions. Do some analysis.
Last, we all know there was an update in August that impacted many websites, which saw drops in traffic, and smaller follow-up updates are still affecting sites.
My suggestion: do an audit, go through the backlinks, check your keywords, and see which pages were working well. Which keywords went down? Is the reason competition, or do you need to change the on-page?
Hi Tanya,
It happens, and there could be many factors: indexing, caching, Search ignoring the preferred title, or the client seeing personalized results while logged in. I'm working on various sites that go through the same issue every now and then; it just takes time for the correct version to show once the cache is updated.
I would suggest you wait and see, and don't make too many changes to the title in the meantime. You can show the client a screenshot, or you can use https://www.google.com/?gl=us&hl=en&pws=0&gws_rd=cr from the UK (gl sets the country, hl the language, and pws=0 turns off personalized results) and see whether you get the same thing.
Thanks for your help. Do you suggest putting hreflang in the sitemap or in the page header? What's the best practice?
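(For reference, the sitemap form of the same annotation looks roughly like this, reusing the placeholder URLs from the original question; Google accepts hreflang either in the HTML head or in the sitemap, so it is mostly a maintenance choice:)

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.xyz.us/</loc>
    <!-- Each URL entry repeats the full set of alternates, itself included -->
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.xyz.us/" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.xyz.co.uk/" />
  </url>
  <url>
    <loc>https://www.xyz.co.uk/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.xyz.us/" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.xyz.co.uk/" />
  </url>
</urlset>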
Hi everyone,
We have more than 20 websites for different regions, and every site has its own ccTLD. The thing is, we are getting conflicts in the SERPs for our English sites, and almost all the English sites share the same content; I would say 70% of the content is duplicated. Despite having proper hreflang, I see .co.uk results in Google US, and not only .co.uk: other sites are showing up as well (xyz.in, xyz.ie, xyz.com.au).

The tags I'm using are below. If the site is for the US, I'm using these canonical and hreflang tags:

<link rel="canonical" href="https://www.xyz.us/" />
<link rel="alternate" href="https://www.xyz.us/" hreflang="en-us" />

and for the UK site:

<link rel="canonical" href="https://www.xyz.co.uk/" />
<link rel="alternate" href="https://www.xyz.co.uk/" hreflang="en-gb" />

I know that with ccTLDs we don't have to use hreflang, but since we have duplicate content we added it just to be safe, and from what I have heard/read there is no harm in having hreflang (if implemented properly).

Am I doing something wrong here? Or is it conflicting because of the canonicals on the same content across regions, so that we are confusing Google (and Google is just showing the most authoritative and relevant result)?

Really need help with this.

Thanks,