Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • One alternative to the "noindex" directive described below is to implement a canonical link tag pointing to an alternative page you would prefer to have indexed for the same content. This is only relevant if there is another very similar page. And it is not as strong a directive as "noindex", so I would not rely on canonical if it's extremely important that the page not be indexed, like perhaps some sensitive info. If you do go the noindex route, then after you verify it has been properly implemented, I would suggest using the removal tool in Google Search Console. It technically isn't needed if you have the noindex directive, but it is usually recognized more quickly, and the two won't conflict. Then, once the page is out of Google's index (and only after it is removed and has the noindex tag), the next step is to disallow crawling of that page in your robots.txt file.
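    As a sketch of the pieces described above (the URLs and paths are hypothetical):

    ```html
    <!-- Option A: noindex via a robots meta tag in the page's <head> -->
    <meta name="robots" content="noindex">

    <!-- Option B: canonical link tag pointing at the preferred similar page -->
    <link rel="canonical" href="https://example.com/preferred-page/">
    ```

    ```
    # robots.txt - add this ONLY after the page has dropped out of the index;
    # if you block crawling first, Googlebot can never see the noindex tag.
    User-agent: *
    Disallow: /sensitive-page/
    ```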

    | seoelevated
    0

  • Hi there! You can read about how to calculate Domain Authority here: https://moz.com/learn/seo/domain-authority In terms of improving your Domain Authority, it's best to first look at your on-page SEO to make sure you've covered the basics. Then you also want to look at your off-site SEO and building more links to your site. There are lots of different areas of link building to explore, from patching up broken links, to improving your internal link profile, to good ol' fashioned content creation. Here is a great video by our founder Rand which goes over some easy link building tactics to get you started. I hope that helps to start you off! Let me know if you have any other questions :]

    | Natalie-Alexis
    0

  • Hi Chris, thank you very much for your help and suggestions, it is much appreciated. I'll de-noindex a handful of my biggest artist pages and see if they attract much interest from users. As for the /venues/ pages, these have been fairly neglected to date, so perhaps I need to really focus my attention on them, as you say, and bring in some cross-referencing. I have also wondered whether allowing companies to create pages dedicated to their events would be a good route to take - it could be done with ease, so perhaps I should investigate further. Again, thanks very much, and hopefully I can report back with good news at some point. Best wishes, Mike

    | mjk26
    0

  • Hi Ali, I have the same problem. This is my site: https://iranirent.ir/

    | digitalseodm
    0

  • Right but most pages of my site have been indexed. They're not ranking well yet and that's what takes time. The indexing happens quickly, it's the ranking that takes time. I appreciate your responses, though.

    | TexasBlogger
    0

  • De-indexed would mean it is removed from Google search completely. What you have is a page that is 'temporarily' being shown in and then dropped from search. This is usually the result of a bad backlink profile, so I'd suggest checking the most recently placed links and adjusting accordingly.

    | Vanderlindemedia
    0

  • Thank you guys for your help, it helped me solve my problem. I restored my pages.

    | roynguyen
    0

  • Hey Mike. Not to fret, this is actually a common mistake in theme development. The meta description shouldn't be hard-coded in the WordPress theme, because that is handled through the wp_head() hook. That hook is where Yoast and other plugins attach themselves to deliver all the SEO-optimized meta and header tags.
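    As a minimal sketch, the theme's header template (header.php is the standard WordPress convention) should just call wp_head() rather than printing its own description tag:

    ```php
    <!-- header.php: no hard-coded <meta name="description"> here -->
    <head>
        <meta charset="<?php bloginfo( 'charset' ); ?>">
        <?php wp_head(); // Yoast and other SEO plugins hook in here to output meta tags ?>
    </head>
    ```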

    | Advanced-Air-Ambulance
    1

  • It sounds as though you should be OK in that case - if they are all site.com/post, then it shouldn't matter how many categories they are in. In theory you can have Topics and Categories - it all depends on how the site is set up, but I would probably say it's best to focus your efforts on one if I had to guess without knowing the site and all the considerations inside out. Good luck.

    | willcritchlow
    0

  • It seems like Google only has a handful of distinct pages indexed at the moment - whereas Bing has about 10x as many. So that seems to indicate something wrong specifically for Google. I'd start by checking your Search Console - are there any errors? If you use the URL inspection tool and visit the URL in question (studyplaces.com/15-best-minors-for-business-majors/), does it tell you why it has been canonicalised? What happens if you view the page as Google saw it? Is there any chance that you are blocking Googlebot / cloaking in any way? Have you had any website outages or downtime? As others noted, your about page is now missing - did you do that deliberately to see if it resolved this issue?

    | willcritchlow
    0

  • Hi Jeroen, Many websites have category or listings pages that contain substantially different lists of links each time Google crawls them. This can be because they are rotating the top listings (like you describe) or simply because the velocity of content creation (and in some cases archiving / removal) is high enough that it appears to change dramatically (think e.g. the reddit "new" page). As such, I don't think you need to do anything particularly special here - it should "just work" for the page in question - depending on the details, you might want to make sure that there is enough other content on the page that it is substantial enough in its own right. The other thing I'd consider is whether you want to have more static crawl-paths available to make sure that googlebot always has a way of discovering and crawling all listings - whether you do this via categories, tags, or via some other means.

    | willcritchlow
    0

  • You're right that the best plan is likely to look at adding that content onto the mobile versions of the category page (though it's worth rolling it out slowly and carefully if you can't split test, because we have seen it be good or bad in different circumstances - see this Whiteboard Friday for example). In theory, with mobile-first indexing, Google will be crawling your site with a mobile user agent, and so as long as you are treating Googlebot the same as you treat other similar user agents, it should see the page exactly as you do when you visit with a mobile browser (or emulate mobile using Chrome, for example). There are various ways to check different parts of this:
    - Check what is actually indexed - by viewing the cached version of the page, and/or searching "in quotes" for unique text that only appears on a specific category page
    - Check what Google sees - using the URL inspection tool in Search Console and selecting "view crawled page"
    Good luck - I hope that helps.

    | willcritchlow
    0

  • Good question! The way to handle this is by using something called "hreflang". Here is Google's documentation on it: https://support.google.com/webmasters/answer/189077?hl=en Hreflang is simply some on-page code which says "this is the US version, that is the UK version", etc. Google uses it in exactly this kind of scenario: it determines which pages should appear in which countries and avoids content duplication issues. Don't redirect the pages - that will stop users (and Google) from accessing the international variations, so the hreflang won't work. All you need to handle this is hreflang. Hope that helps!
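    As a sketch (the URLs are hypothetical), hreflang annotations are link tags placed in the <head> of every variant, with each variant listing the full set plus an x-default fallback:

    ```html
    <!-- Placed on BOTH the US and UK pages (hypothetical URLs) -->
    <link rel="alternate" hreflang="en-us" href="https://example.com/us/page/">
    <link rel="alternate" hreflang="en-gb" href="https://example.com/uk/page/">
    <link rel="alternate" hreflang="x-default" href="https://example.com/page/">
    ```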

    | R0bin_L0rd
    0

  • The backlink profile is robust (with links from huffpost, edu sites, lifehacker, nbcnews.com, etc.). The site has a DA of 40 (per MOZ) and 1.6k linking domains. I can definitely say that the backlink profile is not a detractor from the site.

    | shags313
    0

  • Yes, that "loading" score is horrible. Eight seconds just to load the first few batches of scripts (JavaScript). Imagine the experience for a mobile user on limited data. Offload as many of the scripts as possible; a neat website really doesn't require so many bells and whistles all over the place.

    | Vanderlindemedia
    1

  • It's a myth that your DA drops because you put links in a disavow file. Disavow is a Google-only (or Bing) tool for when, let's say, you get spammy links from a rogue domain and there's no way you can get them removed. Moz can't read the disavow file you submit to Google, so I'm not sure how the connection is being made here. Moz, like any other tool, just counts the incoming followed links and derives your DA from its own formula. That's all there is to it. Again, PA/DA has nothing in common with Google at all, as Google maintains its own algorithm.
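    For reference, a disavow file is just a plain-text list uploaded through Google's disavow tool, one URL or domain: entry per line (the domains below are hypothetical examples):

    ```
    # Spammy links from a rogue domain we could not get removed
    domain:spammy-rogue-site.example
    https://another-site.example/bad-link-page.html
    ```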

    | Vanderlindemedia
    1

  • Links really need months to become fully effective. So as long as the links you're losing are not that valuable, you're good.

    | Vanderlindemedia
    0