Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Server Connection Error when using Google PageSpeed Insights and GTmetrix
Hi Dmytro! Did Thomas' response help answer your question? If so, please mark it as a good answer. If not, please provide us with more details so we can help you sort this out. Thanks.
| Christy-Correll1 -
Spammy 404s: Should I Worry?
The pages definitely don't exist anywhere. Does this mean I have nothing to worry about?
| FSCInteractiveLLC0 -
I have multiple URLs that redirect to the same website. Is this an issue?
I would go further than that and check the link profiles of all the domains. If there is any sign of spam, unnatural anchor text, etc., then do not redirect, as you'll inject that problem into your site. Even if you believe them to be dormant and never used, I would still check in case you were not the original owner. It's always worth doing that check.
| MickEdwards0 -
Am I using pagination markups correctly?
Here is the view all page I created. Each of the anchor text links will take you into each paginated page (6 in total). All 6 of these pages have a canonical tag back to view all page. Did I set that up correctly? Thanks for your insight!
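For reference, the setup you describe (every paginated page canonicalised back to the view-all page) can be sketched like this; the six-page count comes from your question, while the URL itself is a placeholder:

```python
def canonical_tag(view_all_url: str) -> str:
    """Build the <link rel="canonical"> tag each paginated page
    should carry, pointing back to the single view-all page."""
    return f'<link rel="canonical" href="{view_all_url}" />'

# Hypothetical view-all URL: all six paginated pages emit the same
# tag, so ranking signals consolidate on the view-all page.
VIEW_ALL = "https://example.com/articles/view-all"
tags = [canonical_tag(VIEW_ALL) for _ in range(6)]
```

The key check is that the tag is identical on every paginated page and that the view-all page does not canonicalise anywhere else.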
| localwork0 -
Deleting old page and passing on link strength?
Bob, generally speaking it's best to redirect users to the page that's most appropriate. Think of the users and the page they would land on after being redirected. If they were looking for an old product page and were just redirected to the home page, would that be frustrating for them? When it comes to an actual search engine ranking lift or boost, you most likely won't see one; you would simply no longer rank for the products you used to offer. There may still be a boost if the old product page has links pointing to it from external websites: in that case, some of that value can be passed on to the page you redirect to.
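That advice can be sketched as a server-side redirect map (all paths here are invented for illustration): each retired product URL gets a 301 to the closest relevant live page, and anything unmapped falls through as a 404 rather than a blanket redirect to the home page.

```python
# Hypothetical redirect map: each retired product URL points to the
# most relevant live page, never blindly to the home page.
REDIRECTS = {
    "/products/old-widget": "/products/new-widget",
    "/products/discontinued-gadget": "/categories/gadgets",
}

def resolve(path: str, default: str = "/") -> tuple:
    """Return the (status, target) a server-side rule would emit:
    301 to the mapped page, or a plain 404 when nothing is a better
    landing page than the home page."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 404, default
```

In practice this map lives in your server config (.htaccess, nginx, etc.); the sketch just makes the decision logic explicit.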
| GlobeRunner0 -
301 Redirect back to original domain
Angela, we usually recommend looking at each page separately and determining the value of each URL. You may end up with a new site structure, but there may be pages on the old site that shouldn't simply be mapped to their counterpart in that structure. You will need to redirect all pages from the old site to the new site, and it makes sense to point each one to the most appropriate page. But when it comes to a new site structure, I wouldn't redirect a URL somewhere just to fit the structure if it doesn't have to be redirected there.
| GlobeRunner0 -
International Targeting - Google Search Console not recognizing the tags
Hi Aleyda. I just stumbled on this thread because I'm having the exact same problem as Kilgray Marketing: Google Search Console isn't recognizing the hreflang tags on my client's site, https://cbisonline.com/eu. I realize this thread is closed since you've already answered the original question, but I was hoping you might be able to provide some insight on my situation. Any help would be greatly appreciated! Thanks in advance...
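One common reason Search Console ignores hreflang is an incomplete tag set: every country version must reference all the others plus itself, ideally with an x-default. A minimal sketch, assuming a folder-per-country layout (the folders and locale codes are placeholders, not taken from the site above):

```python
# Hypothetical folder-to-locale mapping for a folder-per-country site.
LOCALES = {"uk": "en-gb", "fr": "fr-fr", "eu": "en"}

def hreflang_tags(base: str, path: str = "/") -> list:
    """Emit the full alternate set every country version should carry:
    one tag per locale plus an x-default fallback. The same set goes
    on every version, so each page references itself and all others."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{base}/{folder}{path}" />'
        for folder, lang in LOCALES.items()
    ]
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{base}{path}" />')
    return tags
```

Missing return tags (version A lists B, but B doesn't list A) are the single most frequent cause of GSC dropping the whole set.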
| matt-145670 -
Doubt with geoip redirection
Thanks for the answer Robert. Yes, I'm using hreflang correctly, and the SERPs for the different country versions show the corresponding country folders, i.e. google.co.uk shows results from the uk folder, google.fr shows the fr folder, and so on. The redirect only occurs on the homepage, from mydomain.com to mydomain.com/fr, mydomain.com/uk, etc. A user who is in the French folder but located in London isn't redirected to the UK folder.
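A minimal sketch of that homepage-only behaviour (the country codes and folder names are placeholders): only `/` redirects based on the detected country, and deep URLs are never rerouted, which is the pattern search engines tolerate best.

```python
# Country-to-folder map for the homepage-only geo redirect; the
# detected country would come from a GeoIP lookup in the real setup.
COUNTRY_FOLDERS = {"GB": "/uk", "FR": "/fr"}

def geo_redirect(path: str, country: str):
    """Return a redirect target for the homepage only, else None:
    users already inside a country folder stay where they are."""
    if path == "/" and country in COUNTRY_FOLDERS:
        return COUNTRY_FOLDERS[country]
    return None
```

Because only the root URL is rerouted, Googlebot (which mostly crawls from US IPs) can still reach every country folder directly.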
| dMaLasp0 -
404 crawl errors ending with your domain name??
No problem Kerry - just wanted to check you managed to do it and it fixed things? Matt
| Matt-Williamson0 -
Updating Old Content - Should I update In Search Console?
Hi Radi, The time it takes for Google to crawl your website very rarely fluctuates. It will typically take about 2 weeks for a crawl regardless of whether you use the Console to request a recrawl or not. You may save a day or two of waiting with the request but realistically it is probably best to just let Google do its thing and crawl your site when it comes around. It's a much better idea to focus on your content and to make the most of each crawl you make rather than to make small changes and demand a recrawl to determine if it has had any effect. That's a good way to drive yourself insane! Hope this answers your question, Rob
| RobCairns0 -
First Link Priority - Drop Down Menu
Hi Radi, There really isn't any advantage to placing multiple links from your Home Page to a single page on your site. It splits your link juice among those links and adversely affects the target page's ability to rank. The only benefit might be if you are attempting to generate more leads from a UX perspective and want to create multiple CTAs to your digital marketing page. From an SEO perspective, however, any internal linking should be strictly limited to relevancy and pragmatism: give your visitors what they want in the easiest way you can without spamming your own site. One link from your Home Page to any page on your site is plenty. Cheers and hope this helps! Rob
| RobCairns0 -
404 Errors for Form Generated Pages - No index, no follow or 301 redirect
Hi there, I wonder if you would still be able to help. The number of 404s is increasing significantly, and the majority only appear in GSC. The reason I think this could be related to search URLs is that they are increasing significantly every day. The robots.txt has blocked some, but as the number continues to increase I think there could be a few causes, which I need to look into more. A Siteliner report cannot crawl the site due to 'too many redirections for this URL'; this is one reason I suspect there is a wider issue to investigate with the https/http setup. Moz and Screaming Frog are recording some errors (which we expected and need to resolve), but in the hundreds, compared to the thousands recorded in GSC. Any other ideas / suggestions would be appreciated. Many thanks
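One way to separate form/search-generated 404s from genuinely missing pages is to classify the GSC export by query parameter; a sketch, with hypothetical parameter names:

```python
from urllib.parse import urlparse, parse_qs

# Parameter names a site's search or contact forms might append;
# adjust to whatever actually shows up in the GSC 404 export.
SEARCH_PARAMS = {"q", "search", "keyword"}

def is_search_url(url: str) -> bool:
    """True if the URL carries a known form/search parameter and is
    therefore a candidate for a robots.txt disallow or noindex rule
    rather than a 301 redirect."""
    params = parse_qs(urlparse(url).query)
    return bool(SEARCH_PARAMS & set(params))
```

Running the full GSC export through a filter like this usually makes it obvious whether the growth is parameter-driven or a genuine crawl/redirect problem.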
| Ric_McHale0 -
Crawl rate dropped to zero
Hello, answers to the bolded questions: At this rate, how long would it take Google to crawl all of your pages (maybe it feels 10-15 is fast enough)? Over 50 days. I still cannot believe it would be just a coincidence that the crawl rate dropped so suddenly only because Google suddenly thinks my pages shouldn't be crawled that often. After all, the amount of new content, the quality of new links, and all the other factors keep improving on my site, and before the drop the crawl rate was increasing steadily. It has to be some technical issue? Has the average response time increased? If so, maybe Google feels it's overloading the server and backing off. No, it has actually gone down a little (not much, though).
| pok3rplay3r0 -
Can I Block https URLs using Host directive in robots.txt?
Hi Ramendra, To my knowledge, you can only provide directives in the robots.txt file for the domain on which it lives. This goes for both http/https and www/non-www versions of domains. This is why it's important to handle all preferred domain formatting with redirects that point to your canonicalized version. So if you want the http://www version to index, all other versions redirect to that. There might be a workaround of some sort, but honestly, what I described above with redirection towards preferred versions is the direction you should take. Then you can manage one robots.txt file, and your indexing will align better with what you want.
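The point that robots.txt only governs paths on the host it lives on can be checked with the standard library's parser; a small sketch with an invented rule set:

```python
import urllib.robotparser

# A hypothetical robots.txt, parsed directly from its lines; in
# production each protocol/host combination serves its own copy.
rules = [
    "User-agent: *",
    "Disallow: /search",
]
rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# The parser evaluates only the path portion of a URL; it has no
# notion of http vs https, which is why redirecting everything to
# one canonical host matters.
blocked = not rp.can_fetch("*", "https://example.com/search?q=x")
allowed = rp.can_fetch("*", "https://example.com/products")
```

There is no standard "Host" directive in robots.txt that Google honours; the redirect approach above is the reliable way to pick a version.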
| LoganRay0 -
Spam link? Links from linguee
In Search Console I have 900 links from different Linguee domains, and I'm not sure if I have to take action. After reading this post and the previous one I'm still not sure what to do. Does anyone have any experience to share about Linguee links?
| dMaLasp0 -
One robots.txt file for multiple sites?
Hi Rena. Yes, if both sites are separate domains that you want to use in different ways, then you should place a different robots.txt file in each domain root so that they're accessible at xyz.com/robots.txt and abc.com/robots.txt. Cheers!
| RyanPurkey0 -
Webmaster Tools hentry errors showing pages that don't exist
Without more information or a site to look through, I did a cursory search of hentry issues that could be the cause of your problem and the potential fixes. https://www.acceleratormarketing.com/trench-report/google-analytics-errors-and-structured-data/ https://remaintenance.io/blog/2015/05/fix-hentry-errors/
| MikeRoberts0 -
Panda Penalty Recovery?
If it is Panda then cleaning up the site to make it more useful is your best bet. Make sure the design is clean. Make sure the writing is answering the user's question, and that it's well written. Make sure that the function of the page matches the search query. Fixing these things should result in an eventual improvement in rankings. If you were cloaking, that could be a penalty all by itself. Make sure to fix anything shady, and consider whether non-standard implementations or architecture could have confused Googlebot or made it look like you were trying to manipulate rankings. Check Webmaster tools for any warnings or manual penalties for sure. You can also cross-reference drops in traffic to what others have seen on similar dates to make sure it was Panda.
| Carson-Ward0 -
301 Redirect Question
Hi Matt, To answer your question directly, the website provider may or may not handle 301 redirects themselves, depending on your agreement with them. If they are responsible for website development and maintenance, then it would likely be their responsibility to ensure the website is running properly. However, the reasons behind a redirect mean they may not cover that aspect. Typically, a redirect is used for re-branding purposes or to direct greater link juice through your website; in other words, it is an additional service that likely isn't covered in a contract unless that was the specific purpose of hiring them. In terms of choosing between the www. and the naked URL version: anyone conducting a 301 redirect should have been careful to redirect one way or the other, as mixing them up creates separate URL link chains. That being said, authority and relevance can still flow through your website since you have on-site factors like title tags to take care of this. From what you have provided, I agree with Mike that there doesn't seem to be much to worry about here with regards to the redirects set up on your website. Some people like to have consistency across their domain; some like naked URLs and others don't. There are benefits to both decisions, but in terms of what you are asking, the impact will be very small and likely won't hurt your rankings. Hope this helps, and reach out to me if you have any further questions. All the best, Rob
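Picking one host and sticking to it can be sketched as a small URL normaliser; the choice of the www version here is arbitrary, purely for illustration:

```python
from urllib.parse import urlparse, urlunparse

# Hypothetical preferred host; a single 301 rule (or this kind of
# normalisation at the application layer) keeps the www and naked
# versions from splitting into separate URL link chains.
CANONICAL_HOST = "www.example.com"

def canonical_url(url: str) -> str:
    """Rewrite any www/naked variant onto the one preferred host,
    leaving scheme, path, and query untouched."""
    parts = urlparse(url)
    if parts.netloc != CANONICAL_HOST:
        parts = parts._replace(netloc=CANONICAL_HOST)
    return urlunparse(parts)
```

Whichever version you pick, the essential part is that the other version always answers with a single 301 to its counterpart, never a chain.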
| RobCairns0