Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Fab, thank you both for your thoughts. I was 99% sure I was going to do it; I just needed someone else to agree with me.

    | RebekahVP
    0

  • Hi Alex, Apologies for taking so long to reply to your thorough answer (I work one day a week for this client). This is very useful and clarifies the procedure we have to go through. I did contact Shopify; it is a great platform, and they were quite helpful concerning the apps the platform uses, but not helpful when it came to arranging the redirects. So, thank you again for your reply, it's going to make a difference to our SEO! Jim

    | LucyBee
    0

  • Most site migrations, whether they involve a redesign or not (or whether they move domain or simply alter existing architecture, e.g. HTTP to HTTPS), will incur a small dip in performance, yes.

    Usually when you perform a site migration, it's for strategic rather than tactical reasons. You're usually thinking of the long term: your belief is that, in the long term, the new domain and/or design/architecture will perform better than the old one(s). If redirects are not properly handled, you could lose all of your traffic quite easily. If redirects are handled correctly, you're in a much better position, but still likely to suffer some small dent in performance (usually not lasting longer than 1-2 months, if you keep producing good content and earning great links).

    What you have to remember is that if you always play it safe and never 'evolve', you might incur fewer cuts and bruises here and there, but you will die faster. As others overtake you through their efforts, you sink and fall behind. It's worth striding out there and taking a few nicks and cuts to preserve your overall life-span for longer (think of it like regular, rigorous exercise: it's painful when you do it, but later you see the benefit).

    301 redirects can transfer up to 100% of your SEO authority from one place to another, but they won't always. If there are too many links pointing at redirects, that can make them slightly less effective. If redirects begin to chain (redirects to redirects), or if the wrong type of redirect is used, that can drastically affect the transfer, and you could see as little as 0% of the prior SEO equity on your new domain.

    Another thing: if the content is relatively different on the old and new pages (in machine terms, think Boolean string-similarity comparison, NOT "oh yeah, as a person it looks similar to me"), that can directly obstruct the transfer of SEO authority through a 301 redirect. Google has chosen to rank page X; if you replace it with content Y, then it becomes a risk to Google. If content is mostly new, it mostly has to prove itself again (and redirects become largely nullified).

    To some extent you can get around this by performing backlink amendments: getting webmasters to change their links to your site so that they hit the new domain/architecture and not the old one. This means that the backlinks are not flowing through redirects, and thus Google can have more confidence that the new content is just as good (for similar search terms) as the old content was. If many webmasters decline to update links for you, that could be a sign that your old content was more useful than your new content (so roll back!).

    Your new domain, if it hasn't been used before (ever), may be sandboxed by Google for a few weeks. That can be normal, until Google digests all the redirects, the re-linking, and your usage of Search Console's change of address tool (which you absolutely should use, but don't mess it up by even one character or you'll cause yourself months-long headaches).

    Sometimes, if everything goes swimmingly, you can get very lucky and not even see a dip at all. That's not the norm, so don't set all your expectations around that.
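
    If you want to check for chained or wrong-type redirects yourself, a quick script can walk the hops one at a time. Here's a minimal Python sketch, assuming the requests library is installed and using a placeholder URL:

        import requests
        from urllib.parse import urljoin

        def trace_redirects(url, max_hops=10):
            # Follow redirects one hop at a time, printing each step
            for _ in range(max_hops):
                resp = requests.get(url, allow_redirects=False, timeout=10)
                if resp.status_code not in (301, 302, 303, 307, 308):
                    print(resp.status_code, url, "(final destination)")
                    return
                target = urljoin(url, resp.headers.get("Location", ""))
                print(resp.status_code, url, "->", target)
                url = target
            print(f"Gave up after {max_hops} hops - possible redirect loop")

        trace_redirects("http://example.com/old-page")  # placeholder URL

    Anything other than a single 301 straight to the final page is worth fixing before the migration settles.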

    | effectdigital
    1

  • I would definitely not allow search engines to index those types of results pages. To be fair, they're unlikely to come across them, as a bot wouldn't typically fill in a search box, but they might follow a link from somewhere else. For products, I would definitely want to be using category (or similar) pages to define what the search engines see.
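
    If you want to confirm that those results pages really are blocked from indexing, a noindex directive can be verified with a short script. A rough Python sketch, assuming requests is installed and using a hypothetical results URL (the string match on the meta tag is deliberately crude):

        import requests

        url = "https://example.com/search?q=widgets"  # hypothetical results URL
        resp = requests.get(url, timeout=10)

        # noindex can be sent as an HTTP header or as a meta robots tag
        html = resp.text.lower()
        in_header = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
        in_meta = 'name="robots"' in html and "noindex" in html

        print("X-Robots-Tag noindex:", in_header)
        print("Meta robots noindex (rough check):", in_meta)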

    | Xiano
    0

  • Thank you again, Alex. Moz has tagged a bunch of these pages as "temporary redirects", so I have them all set to "disallow" right now. I'm hoping that will fix the issue. I'm not sure why Moz is flagging them as temporary redirects; they are just review pages for my products, which I guess are generated when a customer clicks the "Leave a Review" button and is taken to these review pages.

    | AllChargedUp
    0

  • Google will often make up its own mind about what to put in your description, especially if it thinks that your provided description doesn't match what it thinks the page is about. I'm afraid your developer hasn't fixed the issue. I was looking in the wrong place (your screenshot identified the issue): the malware isn't replacing the meta description, but actually inserting text at the top of the page, and only when it detects that it is the mobile Googlebot visiting. Using Google Chrome Developer Tools, you can set your user agent and see the issue yourself (see my screenshot). If I were you, I'd disable all the plugins and then reload the page in Chrome with your user agent set and see if that helps; if not, I would look at your theme's JS/source files.
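
    If you'd rather test this outside the browser, you can fetch the page twice with different user agents and compare the responses. A minimal Python sketch, assuming requests is installed; the URL is a placeholder and the Googlebot smartphone UA string is modelled on Google's published one:

        import requests

        URL = "https://example.com/"  # the affected page (placeholder)

        BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
        GOOGLEBOT_UA = ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
                        "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/100.0.0.0 "
                        "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
                        "+http://www.google.com/bot.html)")

        browser = requests.get(URL, headers={"User-Agent": BROWSER_UA}, timeout=10).text
        bot = requests.get(URL, headers={"User-Agent": GOOGLEBOT_UA}, timeout=10).text

        # If the bodies differ, something is serving special content to Googlebot
        print("Responses differ:", browser != bot)

    Bear in mind some malware also checks the visiting IP address, so an identical response here doesn't fully rule it out.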

    | Xiano
    0

  • The canonical tag is meant to tell Google: "These two pages are the same." It was not meant to prioritize pages in the rankings. Because your pages are very different from one another, Google is probably ignoring the canonical tag. "Should we just appreciate the fact that Google ranks us twice on the first page for this important keyword?" Yes. You got two pages in the top ten of Google. When that happens to me, I give thanks.

    | EGOL
    0

  • The true answer to reinstating the old site depends upon the quality of work done on the many parts of the job by the PR agency.   If this was my site, and it was performing kickass before these PR guys got their hands on it, I would toss the old version back up and do a deeper assessment - because the results thus far suggest that the PR agency doesn't know much about SEO. <kibitz>In the past few years a lot of PR agencies have become spammers who wear suits to work.</kibitz>

    | EGOL
    0

  • Thanks for your reply, Nick! If you type "queen elizabeth stakes" into Google, you are likely to see articles from Just Horse Racing:

    https://www.justhorseracing.com.au/fields-results/race-fields/nominations/queen-elizabeth-stakes-winx-nominations-2019/513812
    https://www.justhorseracing.com.au/fields-results/race-fields/queen-elizabeth-stakes-field-winx-2019/514064
    https://www.justhorseracing.com.au/news/australian-racing/maximum-eight-rivals-for-winx-in-qe/514125
    https://www.justhorseracing.com.au/news/australian-racing/japanese-raider-to-take-on-winx-in-queen-elizabeth-stakes/514120

    And from Punters:

    https://www.punters.com.au/news/winx-to-face-eight-rivals_179095/
    https://www.punters.com.au/news/winx-1.06-for-queen-elizabeth_179099/

    Whereas our article, https://www.racenet.com.au/news/eight-challengers-for-the-winx-swansong-in-the-queen-elizabeth-stakes-20190409, does not appear.

    Or if you search "sydney cup", Just Horse Racing and Punters both feature in Top Stories:

    https://www.punters.com.au/news/full-field-for-sydney-cup_179100/
    https://www.justhorseracing.com.au/news/australian-racing/sydney-cup-favourite-dubhe-draws-gate-four/514145

    Our article https://www.racenet.com.au/news/sydney-cup-field-and-barrier-draw-20190409 doesn't.

    I greatly appreciate any help, Nick!

    | Saba.Elahi.M.
    0

  • Lots of people are facing this same problem. It seems to be happening because Google is having some technical issues.

    | jacobmartinnn
    0

  • Hi Hecksler, Did you ever resolve this? A quick idea from me: double-check ALL versions of your website within Google Search Console. You can now register the entire domain property using DNS: https://searchengineland.com/how-to-set-up-google-search-console-domain-verification-for-site-wide-reporting-data-313256

    I found that Google was trying to crawl a very old HTTP sitemap from about five years ago for one of my sites, and thus I was able to delete it. There are some mixed feelings within the search community about whether or not Googlebot really "guesses" URLs, so it's more than likely they are getting the links from somewhere: https://stackoverflow.com/questions/20855082/googlebot-guesses-urls-how-to-avoid-handle-this-crawling

    Look forward to hearing from you, Nick

    | NickSamuel
    0

  • Hello SEOhmygod, If you could share your site, I would be able to provide better input. From the sound of it, though, if I understand correctly, you are canonicalizing each department back to a version of the page for all departments. That is probably not ideal. It would be better for each department to have its own landing page for each category, on which you would customize the title, description, and on-page content for that particular department. You may also want to consider rewriting the URL instead of using a filter, so that .com/tyres?department=1 becomes .com/tyres/mountain-biking/ or .com/mountain-biking/tyres.
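
    To illustrate the rewrite idea, here is a hypothetical sketch using Python and Flask (the framework, route, and department IDs are all assumptions - the same mapping could be done with any CMS or web-server rewrite rule). It 301-redirects the old filtered URL to a clean, department-specific path:

        from flask import Flask, redirect, request

        app = Flask(__name__)

        # Hypothetical department-ID-to-slug mapping
        DEPARTMENT_SLUGS = {"1": "mountain-biking", "2": "road-cycling"}

        @app.route("/tyres")
        def tyres():
            slug = DEPARTMENT_SLUGS.get(request.args.get("department", ""))
            if slug:
                # Permanent redirect to the clean URL, which can then carry its
                # own title, description and on-page content
                return redirect(f"/tyres/{slug}/", code=301)
            return "All tyres"  # the unfiltered category page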

    | Everett
    0

  • Hi Dilip, You have a chain because your original site redirects from http to https before the redirection kicks in to take the user to the new site. It could be that you have two separate systems creating redirections: I suspect one handling your http -> https, set up in the web server software, and then further ones in your CMS or htaccess. I wouldn't worry too much, to be honest; a chain of that nature isn't a big deal. If you are concerned, you just need to ensure that the specific redirects are processed first and are configured so that the source is relative, redirecting to an absolute path.
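
    If you want to see exactly which hops are happening and in what order, the requests library records them for you. A quick Python sketch with a placeholder domain:

        import requests

        # Placeholder for one of your old http URLs
        resp = requests.get("http://old-site.example/some-page", timeout=10)

        for hop in resp.history:  # each intermediate redirect response
            print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
        print("Final:", resp.status_code, resp.url, f"({len(resp.history)} hops)")

    If the first hop is http -> https on the old domain and the second is old site -> new site, you're seeing exactly the two-system behaviour described above.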

    | Xiano
    0

  • A wildcard will work; it means that "http://old_url/anything_here" will redirect to "http://new_url/anything_here". However, it looks like you can set up individual redirects using the box next to the slash and the box underneath labelled "Redirects to". For example: /contact.asp Redirects to: https://bimcosupply.com/contact/

    I note you say you have changed it to redirect to https now, but I am still seeing a redirect to http. I also see that you are using Cloudflare. If you haven't already, I would suggest turning off caching for the moment, as caching will make testing more difficult (otherwise, after each change, make sure you flush the cache for both domains).

    At the moment I am still seeing the same behaviour, with all of your redirects going to the home page. Given that other random URLs give a 404 (https://bimcosupply.com/sdfdsfs -> 404), I think your individual redirects must be broken. Which plugin are you using? I'm a fan of Redirection by John Godley: https://en-gb.wordpress.org/plugins/redirection/

    Regarding the toggle switches, I would imagine they should be turned off. Your site is currently indexable, but that could perhaps be caching.
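
    Once the individual redirects are in place, you can batch-test them rather than clicking through by hand. A small Python sketch, assuming requests is installed; the mapping below uses the example from this thread and should be extended with your real paths:

        import requests

        # Old URLs mapped to the pages they should land on
        EXPECTED = {
            "http://bimcosupply.com/contact.asp": "https://bimcosupply.com/contact/",
        }

        for old, want in EXPECTED.items():
            resp = requests.get(old, timeout=10)  # follows redirects by default
            status = "OK" if resp.url == want else "WRONG TARGET"
            print(status, old, "->", resp.url, f"({len(resp.history)} hops)")

    Remember to flush the Cloudflare cache before each test run, or the results will be misleading.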

    | Xiano
    0

  • Yes, I tried the old Search Console option before I posted in here, but sadly it just redirects you back to the new version. However, I hadn't even thought about the redirect opportunity, and considering the website is built on WordPress, that should be easy enough to set up. Thanks so much!

    | MainstreamMktg
    0

  • Depending upon how you've set up your previous redirects, it could be a problem, but I suspect you'll be fine. In my experience, a small redirect chain won't cause a massive issue, and unless you already have some multi-step redirects going on, your new plan won't put you at risk of more than two steps anyway. Obviously, a loop would cause serious issues, but I don't think you are at risk of this.

    Ideally, your redirects should be relative to absolute, i.e. /oldpage to https://siteImanage.com/newpage. In this case, you would just need to update your absolute paths to the new URL. If you ensure these are matched first, then you would only have one redirect, regardless of whether the client hits the www or non-www domain. If the full domain redirect is matched first, you'll end up with two steps for those people hitting old pages on the old domain.

    If your redirects are matching based on relative links and redirecting to relative URLs (as some plugins do), i.e. /oldpage redirects to /newpage, then you'll end up with a two-step process if someone follows an old link: first, the site will redirect from non-www to www, and then it will redirect to the correct page.

    If you have redirects such as "https://siteImanage.com/oldpage" redirecting to "https://siteImanage.com/newpage", then you would create a two-step process again, first to the new page and then to the new domain. Of course, your redirects wouldn't work on the new domain as they wouldn't match, which may or may not be a problem for you.
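
    To confirm which of these cases you're in after the change, you can count the hops for both host variants. A rough Python sketch, assuming requests is installed (the domain is the placeholder from your question and the path is illustrative):

        import requests

        PATH = "/oldpage"  # a representative old URL

        for host in ("http://siteImanage.com", "http://www.siteImanage.com"):
            resp = requests.get(host + PATH, timeout=10)
            hops = len(resp.history)
            note = "" if hops <= 1 else "  <- chained"
            print(f"{host + PATH}: {hops} hop(s), lands on {resp.url}{note}")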

    | Xiano
    0

  • Hi Kay, These two example URLs do not seem to represent duplicate content - the listings are different for each page. I wouldn't want to set canonical tags on these pages, as they are unique. Did you see an error in an SEO tool of some kind indicating there was a duplicate content problem here?

    For general applications, canonical tags should be added to the duplicate page(s) themselves, and those canonical tags should refer to the preferred/canonical page that you want to keep indexed (no changes need to be made to the preferred/canonical page itself). So if these pages were true duplicates of your homepage, each duplicate would need the canonical tag, but the homepage itself would not.

    From your examples, I'd just caution you may not be dealing with actual duplicate content - they look like valid pages to me. Best, Mike
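
    If you ever do need to verify canonicals across a set of URLs, a short script can print what each page declares. A sketch assuming the requests and beautifulsoup4 libraries are installed, with a placeholder URL:

        import requests
        from bs4 import BeautifulSoup

        url = "https://example.com/some-page"  # placeholder
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

        # A duplicate page's canonical should point at the preferred page
        link = soup.find("link", rel="canonical")
        print(url, "declares canonical:", link["href"] if link else "none")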

    | MikeTek
    0

  • Hi there, As long as the WordPress settings result in serving the new content at the root (www.example.com/ rather than www.example.com/newhomepage), you will not be putting any page authority at risk with this change. The only change visible to Google in that situation would be the updated content. Hope that helps.

    | willcritchlow
    0

  • It appears the site has been impacted by the Medic update of 5 August 2018. Review your data and check when the decline started. In answer to your query, schema has a massive impact on some sites; I'm not sure where your site fits into the schema discussion, but properly executed in the right circumstances, schema is only beneficial. It would take a technical audit to work out why the wrong pricing is showing; as a .com.au on an Australian-based platform, it should be coming up in AUD.
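
    On the pricing point, one example of properly executed schema is stating the currency explicitly in Product markup. A hypothetical sketch that builds the JSON-LD in Python (the product details are invented):

        import json

        # Hypothetical product; priceCurrency pins the displayed price to AUD
        product = {
            "@context": "https://schema.org",
            "@type": "Product",
            "name": "Example Widget",
            "offers": {
                "@type": "Offer",
                "price": "49.95",
                "priceCurrency": "AUD",
            },
        }

        # Emit the script tag to embed in the page's HTML
        print('<script type="application/ld+json">')
        print(json.dumps(product, indent=2))
        print("</script>")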

    | ClaytonJ
    0