Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Hi Brian, Yoast's SEO plugin will apply the noindex tag to that unique _http://www.example.com/services/_ page URL. Subsequent pages that adopt that page as a parent will not be affected, so you'll be okay there from a technical standpoint. "there doesn't need to be anything on the parent /services/ page itself" Strategically, I think this is a missed opportunity for SEO and lead generation. Creating a top-level Services page is a great way to position your company well for search queries such as "home remodeling [city name]," highlight what services you offer and what sets you apart from the competition, support a better internal linking structure, and generate leads (CTAs, embedded forms, etc.). -Brian

    Technical SEO Issues | | brianglassman
    1
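    For reference, the noindex Yoast applies here is a standard robots meta tag in the page's head. A minimal sketch of that output (the exact attributes Yoast emits can vary by version and settings):

```html
<meta name="robots" content="noindex, follow" />
```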

  • If this was my site, I would be fighting to keep the breadcrumbs.

    Technical SEO Issues | | EGOL
    0

  • I would probably want to dig a bit deeper on why your team feels there's a need to remove breadcrumbs. From what I can see on the Chico's website, they do indeed use them: https://www.chicos.com/store/category/sweaters/cat8319278/ As well as proper structured markup: https://search.google.com/structured-data/testing-tool/u/0/#url=https%3A%2F%2Fwww.chicos.com%2Fstore%2Fcategory%2Fsweaters%2Fcat8319278%2F To address your question more directly: Properly marked-up breadcrumbs add another layer of meaning and organizational depth to your website, which is parsed by search engines and weighed when organizing SERPs. While I can't assume the direct impact their removal would have on your organic traffic, I would hesitate to "downgrade" your site in terms of semantic markup. This sounds to me like the issue could be less to do with the breadcrumbs themselves and more a styling issue (breadcrumbs too distracting/prominent) or an architectural one (breadcrumb list too long). More helpful information on breadcrumbs and proper markup/structured data: https://audisto.com/insights/guides/2/ Hope this helps. -Brian

    Technical SEO Issues | | brianglassman
    1
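    For context, "proper structured markup" for breadcrumbs generally means schema.org's BreadcrumbList vocabulary. A minimal JSON-LD sketch (the names and URLs are illustrative, not Chico's actual markup):

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Store",
      "item": "https://www.example.com/store/"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Sweaters",
      "item": "https://www.example.com/store/category/sweaters/"
    }
  ]
}
```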

  • Hi Patrick, Thanks for the answer. That's where we are confused. We have been using the suffix "brand & primary keyword," like most websites do. An example is "Page topic xxxx | vertigo tiles," where "vertigo" is the brand and "tiles" is the primary keyword we want to rank for. With the same suffix on all pages, the primary keyword sits at the end of every page title. I wonder how it would work if we brought the primary keyword into the topic itself and used just the brand name as the suffix, like "black colour tiles | vertigo". I wonder which suffix will make more impact at Google. The general public searches with our brand name alone, and only 10% search with our brand name plus the primary keyword. So do we need to change the suffix to just the brand name? Will it help?

    White Hat / Black Hat SEO | | vtmoz
    0
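    The two title patterns being compared above can be written out as plain HTML title tags (the values are the poster's illustrative examples):

```html
<!-- Current pattern: brand + primary keyword as a shared suffix -->
<title>Page topic xxxx | vertigo tiles</title>

<!-- Proposed pattern: keyword folded into the topic, brand-only suffix -->
<title>Black colour tiles | vertigo</title>
```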

  • Hi there If you have ScreamingFrog you can run a crawl on the websites in question to see what they have, although that could take some time if you go site by site. Otherwise, you can do a manual "site:example.com inurl:news", "site:example.com inurl:blog", "site:example.com blog", or "site:example.com news". Either way, this will take some time, as I'm not sure there is a tool specifically for this type of task. Might I suggest using tools like Majestic, BuzzSumo, or SEMRush to find a website's top-ranked or most-linked-to pages. You can quickly filter through to blog or news articles and see what content is working best for particular sections of sites. From there you can prioritize your efforts and work accordingly. Hope this helps a bit! Good luck! Patrick

    Intermediate & Advanced SEO | | PatrickDelehanty
    0

  • Thanks, I will get back to you if I face an issue.

    Social Media | | DebashishB
    0

  • Hi Patrick, We actually created the same content on a duplicate domain before the website migration, to monitor how the new website would perform. But the duplicate domain was allowed to be indexed by mistake, even after the actual website launched. We then added noindex,nofollow,noarchive,noodp later, but I can still see a cached version of the duplicate content in Google. Does this old indexed duplicate content still hurt us? Thanks

    Search Engine Trends | | vtmoz
    0
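    As a quick, hypothetical way to verify that those noindex,nofollow,noarchive,noodp directives are actually being served on the duplicate pages, one could parse the returned HTML for the robots meta tag. A standard-library sketch (not part of the original answer; the sample HTML is made up):

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the comma-separated directives from any <meta name="robots"> tag."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                content = attrs.get("content", "") or ""
                self.directives += [d.strip().lower() for d in content.split(",") if d.strip()]


def robots_directives(html):
    """Return the list of robots directives found in an HTML document."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.directives


sample = '<html><head><meta name="robots" content="noindex,nofollow,noarchive,noodp"></head><body></body></html>'
print(robots_directives(sample))  # ['noindex', 'nofollow', 'noarchive', 'noodp']
```

In practice you would fetch each duplicate URL and feed the response body to `robots_directives`; if "noindex" is missing, the page is still eligible for indexing.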

  • Thanks for clarifying. In terms of weight, I don't think there is any significant difference between your two examples. Your ability to rank prominently for one over the other will depend on many factors, including: internal links pointing to the page, external links pointing to the page, click-throughs from search, on-page optimization, etc. However, I believe the most important factor (tying this all together) will be the user's search query. If Google thinks a user is looking for a specific product, example.com/product/feature might have the most ranking potential. If Google thinks users are looking for an answer to a question it thinks your blog post answers well, example.com/blog/blog-post-1 might have the most ranking potential. I don't know that I would ever "ignore" opportunities to optimize for a specific keyword or phrase, regardless of the page or post type.

    Intermediate & Advanced SEO | | brianglassman
    1

  • Without a specific example, there are a couple of options here. I am going to assume that you have an ecommerce site where parameters are used for sort functions on search results or for different options on a given product. I know you may not be able to do this, but using parameters in this case is just a bad idea to start with. If you can (and I know this can be difficult), find a way to rework this so that your site functions without parameters. You could use canonicals, but then Google would still crawl all those pages and then use the canonical link to work out which page is canonical. That is a big waste of Google's time. Why waste Googlebot's time crawling a bunch of pages that you do not want crawled anyway? I would rather Googlebot focus on crawling your most important pages. You can use the robots.txt file to stop Google from crawling sections of your site. The only issue with this is that if some of your pages with a bunch of parameters in them are ranking, once you tell Google to stop crawling them, you would lose that traffic. It is not that Google does not "like" robots.txt blocks, or that it does not "like" the use of the canonical tag; it is just that these are directives Google follows in a certain way, so if they are implemented incorrectly or in the wrong sequence, they can cause negative results, because you have basically told Google to do something without fully understanding what will happen. Here is what I would do (the long version, for long-term success):

    1. Look at Google Analytics (or other analytics) and Moz tools to see which pages are ranking and sending you traffic. Make note of your results.
    2. Think of the simplest way you could organize your site that would be logical to your users and would allow Google to crawl every page you deem important. Creating a hierarchical sitemap is a good way to do this. How does this relate to what you found in #1?
    3. Rework your URL structure to reflect what you found in #2, without using parameters. If you have to use parameters, then make sure Google can crawl your basic sitemap without any of them, and use robots.txt to block the crawling of any parameters on your site. You have now ensured that Google can crawl and will rank pages without parameters, and you are not hiding any important pages or page information behind a parameter. There are other reasons not to use parameters (e.g. the URLs are easier for users to remember and tend to be shorter), so think about whether you want to get rid of them.
    4. 301 redirect all your main traffic pages from the old URL structure to the new URL structure, and serve 404s for all the old pages, including the ones with parameters. That way all the good pages move to the new URL structure and the bad ones go away.

    Now, if you are stuck using parameters, I would do a variant of the above. Still see if there are any important or well-ranked pages that use parameters, and consider whether a canonical on those pages can point Google to the page that should rank. For all the other pages, use the noindex directive to get them out of the Google index, and only later use robots.txt to block Google from crawling them. You want to do this in sequence, because if you block Google first, it will never see the noindex directive. Now, everything I said above is generally "correct," but depending on your situation, things may need to be tweaked. I hope this information helps you work out the best options for your site and your customers. Good luck!

    Intermediate & Advanced SEO | | CleverPhD
    0
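    The robots.txt step described above can be sketched in a couple of lines, assuming the parameters always appear as a query string after a `?` (the pattern is illustrative, not from the original answer):

```
# Hypothetical robots.txt: block crawling of any URL containing a query string
User-agent: *
Disallow: /*?
```

    As the answer stresses, sequence matters: only add this block after the noindex directives have been crawled and processed, because Google never reads the meta tags of a page it is blocked from fetching.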

  • Thanks for the reply - this was the top organic search result.

    Intermediate & Advanced SEO | | Parker818
    0

  • Hi there! I agree with Patrick. I was going to recommend using Screaming Frog or Google Search Console! Let me know if you try these, don't like them, and need another recommendation.

    Technical SEO Issues | | BlueCorona
    0

  • Like Drones already mentioned, I would try to figure out what search engines could potentially still know about the history of your site. So make sure to check all the available tools: Google Search Console (if you already own the domain), Majestic SEO, Open Site Explorer, and Ahrefs. See what domains are linking to your site and what types of links they are, relevant or non-relevant. If the majority are non-relevant, it will probably harm you, since you would have too many non-relevant links pointing to your site. If you can barely find anything about it, I wouldn't worry too much.

    Intermediate & Advanced SEO | | Martijn_Scheijbeler
    0

  • Ok Patrick Delehanty, thank you so much

    Technical SEO Issues | | innovative1003
    0

  • Hi there! Here are some suggestions to get your company ranking on page one AND outranking the competition:

    • Local SEO - I know you know how important this is, but don't push it to the back burner. Get your local citations as tight and consistent as they can possibly be.
    • Content - This is an obvious one, but continue creating new and unique content for your website! Make sure you are generating content that is helpful for users and proves that you are THE authority in the area. The more direct your responses, the more likely you are to rank, and maybe even land a Google direct answers spot!
    • Videos - Google likes to display a wide variety of media in its search results, including videos. A video is 60 percent more likely to get ranked in a search result than a landing page on the same topic.
    • Images - Optimizing the images on your site can help you get on the first page of Google when it displays image results.
    • Microsites - A microsite (sometimes called a minisite) is a website used to supplement a company or organization's primary domain. More often than not, the microsite will have a URL distinct from the primary domain and its own unique design and navigation. Microsites can help you target different buyer personas, appear more relevant and authoritative, and get multiple listings in organic search results.
    • News - Putting press releases out for your business can potentially score you a spot in the News section of Google, particularly for branded terms.

    We wrote a blog post about getting your business on the first page of Google, so be sure to read that for more in-depth information than I shared above! Let me know if you have any other questions; hope this helped!

    Technical SEO Issues | | BlueCorona
    0