Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • You'll need to inspect the site's server logs to begin narrowing this down, Jens. Causes can range from traffic overload (like from bots, which won't show in your Analytics) to database problems, plugin conflicts, etc.

    | ThompsonPaul
    0

  • The answer probably revolves around your 301 redirects. I assume that you didn't keep the old URL structure. Was your site https before the change? Was it www? Did you have a trailing slash in the old URLs? Did you redirect all pages with backlinks? You really should 301 everything. Did you change the links in the navigation? Are you getting any alerts in Search Console? Did the canonicals change? Did the category filters change? The simple answer is "find what changed". The complex answer is "find what changed". Do you have a copy of the old site? Run an audit on both. Sometimes you can also glean some insight from Google webmaster tools. Pick up a copy of "Ecommerce SEO" by Traian Neacsu. He doesn't have an audit checklist per se, but anything you could be doing wrong is in the book.
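
    For reference, one-to-one 301s in .htaccess can be as simple as the sketch below (a rough example, assuming Apache and made-up placeholder paths; map your real old URLs to their new equivalents):

        # Permanent, page-by-page redirects from the old structure to the new
        Redirect 301 /old-category/old-product.html https://www.example.com/new-category/new-product/
        Redirect 301 /about-us.html https://www.example.com/about/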

    | Satans_Apprentice
    0

  • I can't answer your question, but a word of warning: if you're moving a large site to https://, your rankings will drop for 2-8 weeks depending on the size of your site. We did this for an ecommerce site with 60,000 SKUs, and our rankings tanked for 6 weeks. It's "normal", but Google won't tell you how badly your site will get slammed. Be prepared. A few very important suggestions: Make sure that you set up both http:// and https:// properties in Google Search Console. Create thorough and complete XML sitemap(s) using the http:// prefix, and submit them to the https:// GSC property; this will help Google find and map the redirects more quickly. Keep an eye on Search Console for the count of http and https pages indexed; ideally, the http count should go to zero. Make sure your redirects are set up correctly in your .htaccess file (Apache). There are several sites that will do 301 traces (google "301 tracer"). Keep the redirects at 2 hops or less. Make sure that redirects are 301: 302 is the .htaccess default, so make sure you call out "301". There are several redirects that need to occur: http to https; non-www to www, or vice-versa; URLs with a trailing slash to without a trailing slash, or vice-versa. When you make the cutover, test all combinations of the above. With three either/or choices (http/https, www/non-www, trailing slash or not) there are eight URL variants, and every one except your canonical version should 301 to it.
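
    To make the hop-count point concrete, here's a rough .htaccess sketch (assuming Apache mod_rewrite, with example.com as a placeholder and https://www as the canonical version):

        RewriteEngine On

        # Anything on http (www or not) goes straight to https://www in one hop
        RewriteCond %{HTTPS} off
        RewriteRule ^ https://www.example.com%{REQUEST_URI} [R=301,L]

        # https on the bare domain also goes to https://www
        RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
        RewriteRule ^ https://www.example.com%{REQUEST_URI} [R=301,L]

    Note the explicit R=301 in the flags: leave it off and mod_rewrite issues a 302.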

    | Satans_Apprentice
    0

  • That's weird! Unless they were already in the process of being deindexed. Best practice is to set the parameter. Regards

    | Nigel_Carr
    0

  • Hi Nigel, Thanks for your advice and for confirming my concerns. Regards, Gavin

    | Gavsta
    0

  • Hi, That is a confusing problem, but it seems like most of the page content is coming from the iframe and not the page itself. If you can, I would put noindex (and nofollow) on the page that gets loaded in the iframe and see if that helps. If possible, I would build out the content on the correct page as much as possible without affecting its usability too much. Also, you can try a rel="canonical" pointing to the correct page, and hopefully that will help; a sketch of both tags is below.
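
    Something like this, as a rough sketch (the URL is a placeholder; both tags go in the <head> of the page that's loaded inside the iframe):

        <!-- Keep the iframe'd content page out of the index -->
        <meta name="robots" content="noindex, nofollow">

        <!-- Or consolidate signals to the page you actually want to rank -->
        <link rel="canonical" href="https://www.example.com/correct-page/">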

    | Dalessi
    0

  • If you're considering other hosting providers to speed things up, look into duda.co. From everything I've researched, they have a great thing going.

    | Gabe_BlueGuru
    0

  • How can you submit them to Search Console if they don't live on your root domain? I understand that you can reference the cloud sitemap URL in the robots.txt, but without it being in Search Console you lose visibility into errors and indexing issues.
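
    For context, the robots.txt reference being discussed is a single cross-host Sitemap line, roughly like this (the cloud host name is a placeholder):

        # robots.txt at https://www.example.com/robots.txt
        User-agent: *
        Disallow:

        Sitemap: https://storage.example-cloud.com/sitemaps/sitemap-index.xml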

    | RuthHearn
    0

  • Hi Jorge, Backing up Gaston here, submitting that to the 'Remove URLs' tool in Google Search Console should do the trick, and the noindex tag will keep it out of the index in the future. It is possible this page is still in the index because Google has not recrawled it since you added the tag. Have you checked the latest cache? You can do so using the "cache:" (without quotes) search command in front of the URL you want to check, e.g. cache:www.example.com/page. If you ever do need Google to recrawl (or index) a page, you can request a recrawl using the 'Fetch as Google' tool in Search Console. Hope this helps!

    | Joe_Stoffel
    0

  • Hello Ran, Just to clarify: in Search Console, when you go to Crawl -> robots.txt Tester and click 'See live robots.txt' at the middle right, does it not show the correct file? It could be that Google isn't recrawling the new robots.txt yet.

    | GastonRiera
    0

  • Hi Moz team, To date, Google is still showing a 404 for the sitelinks search box query, even after implementing the JSON-LD. The search page that Google shows doesn't exist anywhere on my site, yet Google keeps showing it. So my questions are: 1) Is Google just taking time to update the SearchAction? 2) If I want the sitelinks search box to use Google search instead of my website's own search, what do I have to do? Thanks!
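
    For context, the sitelinks search box markup in question typically looks like the sketch below (example.com and the /search?q= endpoint are placeholders; the target must point to a results page that actually exists on the site, or Google will land users on a 404):

        <script type="application/ld+json">
        {
          "@context": "https://schema.org",
          "@type": "WebSite",
          "url": "https://www.example.com/",
          "potentialAction": {
            "@type": "SearchAction",
            "target": "https://www.example.com/search?q={search_term_string}",
            "query-input": "required name=search_term_string"
          }
        }
        </script>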

    | amu123
    0

  • Hi Arun, It's certainly best practice to move to the root directory. As you say, visitors are then coming to one domain, not a subdomain. All you need to do is redirect page by page through a 301. When you say they are being 'redirected to 301 code', this is perfectly OK; the 301 code just tells Google that the page has moved permanently. It takes Google a short while to recognise the new pages as replacing the old ones, and for that period you can see old and new in Google, causing a short spell of duplication which could affect the rankings. You just need to sit it out. By all means do a Fetch in Search Console to help speed up the process: Search Console > Crawl > Fetch as Google. Regards Nigel

    | Nigel_Carr
    0

  • Your answer is confusing, sorry. If you're not supposed to add the old site's verification to your new site, how does anyone complete the Name Change Tool? Step 2 of the tool requires 301 redirects to be in place before you can move on. Step 3 of the tool requires you to verify the old site. Obviously, if step 2 is working, then step 3 will always fail. How do you complete the Name Change Tool?

    | cmscss
    0

  • Yes, organic conversions and then organic traffic are where I would focus. Rankings should be monitored, but don't obsess over them. Typically, the origin of backlinks matters. Pick 5 of your top competitors, see which country their backlink profiles primarily come from, and emulate that. The perfect backlink profile always depends on the niche and tends to vary. I will look out for the other questions.

    | John.Moz.com
    0

  • Wow - sorry this question slipped past, Ruth. As long as the proper HTTP-to-HTTPS redirect has been written, there's nothing that needs to be done with canonical tags. The beauty of 301 redirects is that they are server directives - once in place, it's no longer even possible to reach the non-HTTPS URLs. The HTTPS URLs should of course still keep their own self-referential canonical tags, but that's handled automatically in most CMSs (Content Management Systems, like WordPress). Hope that covers what you were asking? Paul

    | ThompsonPaul
    1

  • Hi fogtheagency, www.example.com and blog.example.com are treated as two separate sites, and you shouldn't have a common sitemap for two different sites. The best way of setting up your blog is to have the whole installation in a directory of your main site. If it is WordPress, it's very easy: FTP-save the files and directories, export the WP files and database, and re-upload to www.example.com/blog. This is a much better way of keeping all of the link juice on the one domain, and your blog, as a sub-directory of the main website, will be automatically included in the same sitemap as your main website. If you do move it, remember to 301 the old subdomain URLs as well; a sketch is below. Regards Nigel
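
    A rough sketch of that redirect (assuming Apache, with example.com as a placeholder; this goes in the .htaccess served for the old blog subdomain):

        RewriteEngine On
        # Send every old blog.example.com URL to its new /blog/ home
        RewriteCond %{HTTP_HOST} ^blog\.example\.com$ [NC]
        RewriteRule ^(.*)$ https://www.example.com/blog/$1 [R=301,L]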

    | Nigel_Carr
    0