Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Thanks for clarifying. In terms of weight, I don't think there is any significant difference between your 2 examples. Your ability to rank prominently for one over the other will depend on many factors, including:
    - Internal links pointing to the page
    - External links pointing to the page
    - Click-throughs from search
    - On-page optimization
    - Etc., etc.
    However, I believe the most important factor (tying this all together) will be the user's search query. If Google thinks a user is looking for a specific product, example.com/product/feature might have the most ranking potential. If Google thinks users are looking for an answer to a question it thinks your blog post answers well, example.com/blog/blog-post-1 might have the most ranking potential. I don't know that I would ever "ignore" opportunities to optimize for a specific keyword or phrase, regardless of the page or post type.

    | brianglassman
    1

  • Hi, For me it's a very important part of SEO. An XML sitemap creates a hierarchy of your site for bots and search engines to follow. It ensures that the spiders can easily reach all parts of your website and index your pages quickly. Hope this helps. Thanks
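    For example, a minimal XML sitemap looks like this (the URLs are placeholders; list your own pages):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want crawled -->
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2017-05-01</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/blog/</loc>
      </url>
    </urlset>
    ```

    Submit the sitemap in Google Search Console, or reference it with a Sitemap: line in robots.txt, so the engines can find it.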

    | Alick300
    0

  • Hi Brian, Piggybacking off Kevin's answer, I agree. Typical FAQ questions and answers are all on one page, but if your answers will have enough content (depending on the question), then I would suggest building out individual pages. Some tips: make sure you directly write out the question to be answered and then concisely answer it. The better your answer, the greater the opportunity you have to rank in Position 0 on Google. An FAQ page is a great opportunity to grab this position -- which puts you above all the other organic listings. Read more on Position 0 here: http://www.bluecorona.com/blog/position-0-google-rankings/amp/ To increase your chances of being featured as one of Google's rich answers, you should:
    - Identify a simple question and include that question within the text of your page
    - Provide a direct answer right after the question
    - Offer value-added info
    - Make it easy for users (and Google) to find
    - Use ordered lists, bullets, or tables that Google can read to take up more real estate in the "zero" ranking spot
    Learn more about featured snippets and rich snippets from Google » Hope this helps -- let me know if I can help with anything else.

    | BlueCorona
    0

  • Hey Dieter, I might have considered a more dynamic/interactive solution for showing multiple variations/colors of the same product (gallery/JavaScript, etc.). My concern would be as follows: each time you ask a user to refresh or visit a new page in a purchase funnel, you're increasing the likelihood of a lost opportunity. If a single-page option is not possible, you are right to look at a noindex or canonical solution. Personally, I'd recommend the canonical solution. This way, any existing link juice is attributed from the extraneous URLs to the primary product URL. Best of luck to you. -Brian
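    As a sketch, each variant page would declare the primary product page as canonical in its head (the URLs here are placeholders):

    ```html
    <!-- On example.com/product/widget?color=red -->
    <link rel="canonical" href="https://www.example.com/product/widget" />
    ```

    With this in place, any links earned by the color variants are consolidated onto the primary URL.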

    | brianglassman
    1

  • Thanks for the reply - this was the top organic search result.

    | Parker818
    0

  • Hi there If you have Screaming Frog you can run a crawl on the websites in question to see what they have, although that could take some time if you do the sites one by one. Otherwise, you can do a manual "site:example.com inurl:news", "site:example.com inurl:blog", "site:example.com blog", or "site:example.com news". Either way, this will take some time, as I'm not sure there is a tool specifically for this type of task. Might I suggest using tools like Majestic, BuzzSumo, or SEMrush to find a website's top-ranked or most-linked-to pages. You can quickly filter through to blog or news articles and see what content is working best for particular sections of sites. From there you can prioritize your efforts and work accordingly. Hope this helps a bit! Good luck! Patrick

    | PatrickDelehanty
    0

  • Hi Michael! I recommend checking out this blog for more insight: http://searchengineland.com/how-many-301s-are-too-many-16960 The video on the blog linked above answers:
    - Is there a limit to how many 301 (Permanent) redirects I can do on a site?
    - How about how many redirects I can chain together?
    Other things to watch out for with chained redirects:
    - Avoid infinite loops.
    - Browsers may also have redirect limits, and these limits can vary by browser, so multiple redirects may affect regular users in addition to Googlebot.
    - Minimizing redirects can improve page speed.
    Hope this helps!
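    If you do find chains, collapse them so every old URL points straight at the final destination. A sketch in Apache .htaccess terms (the paths are placeholders):

    ```apache
    # Bad: /old-page -> /newer-page -> /current-page (a chain)
    # Better: point every legacy URL directly at the final page
    Redirect 301 /old-page /current-page
    Redirect 301 /newer-page /current-page
    ```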

    | BlueCorona
    1

  • Of course! Thank you. It's been a long week...

    | Bee159
    0

  • Both Andy and Patrick have provided great responses, and took the words right out of my mouth! I agree, LinkedIn is a better platform for you to reach your relevant audience, so posting your content there is ideal. Don't worry about posting content that already exists on your website; you shouldn't get dinged for it.

    | BlueCorona
    0

  • Without a specific example, there are a couple of options here. I am going to assume that you have an ecommerce site where parameters are being used for sort functions on search results or different options on a given product. I know you may not be able to do this, but using parameters in this case is just a bad idea to start with. If you can (and I know this can be difficult), find a way to rework this so that your site functions without the use of parameters. You could use canonicals, but then Google would still be crawling all those pages and then going through the process of using the canonical link to find out which page is canonical. That is a big waste of Google's time. Why waste Googlebot's time crawling a bunch of pages that you do not want to have crawled anyway? I would rather Googlebot focus on crawling your most important pages. You can use the robots.txt file to stop Google from crawling sections of your site. The only issue with this is that if some of your pages with a bunch of parameters in them are ranking, once you tell Google to stop crawling them, you would then lose traffic. It is not that Google does not "like" robots.txt blocking, or that it does not "like" the use of the canonical tag; it is just that these are directives that Google will follow in a certain way, and if they are implemented incorrectly or in the wrong sequence they can cause negative results, because you have basically told Google to do something without fully understanding what will happen. Here is what I would do (the long version, for long-term success):
    1. Look at Google Analytics (or other analytics) and Moz tools and see what pages are ranking and sending you traffic. Make note of your results.
    2. Think of the simplest way that you could organize your site that would be logical to your users and would allow Google to crawl every page you deem important. Creating a hierarchical sitemap is a good way to do this. How does this relate to what you found in step 1?
    3. Rework your URL structure to reflect what you found in step 2 without using parameters. If you have to use parameters, then make sure Google can crawl your basic sitemap without using any of the parameters. Use robots.txt to then block the crawling of any parameters on your site. You have now ensured that Google can crawl and will rank pages without parameters, and you are not hiding any important pages or page information on a page that uses parameters. There are other reasons not to use parameters (e.g. parameter-free URLs are easier for users to remember and tend to be shorter), so think about whether you want to get rid of them.
    4. 301 redirect all your main traffic pages from the old URL structure to the new URL structure. Show 404s for all the old pages, including the ones with parameters. That way all the good pages will move to the new URL structure and the bad ones will go away.
    Now, if you are stuck using parameters, I would do a variant of the above. Still see if there are any important or well-ranked pages that use parameters. Consider if there is a way to use the canonical on those pages to get Google to the right page so it knows what should rank. For all the other pages, I would use the noindex directive to get them out of the Google index, then later use robots.txt to block Google from crawling them. You want to do this in sequence: if you block Google first, it will never see the noindex directive. Now, everything I said above is generally "correct," but depending on your situation, things may need to be tweaked. I hope the information I gave might help you work out the best options for your site and your customers. Good luck!
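    As a sketch, blocking parameterized crawl paths in robots.txt might look like this (the parameter names are placeholders; substitute your own):

    ```text
    User-agent: *
    # Block crawling of sort/filter parameter URLs
    Disallow: /*?sort=
    Disallow: /*?color=
    Sitemap: https://www.example.com/sitemap.xml
    ```

    Remember the sequencing above: let Google see the noindex first, and only add these Disallow lines once the pages have dropped out of the index.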

    | CleverPhD
    0

  • Like Drones already mentioned, I would try to figure out what search engines could potentially still know about the history of your site. So make sure to check through all the available tools: Google Search Console (if you already own the domain), Majestic SEO, Open Site Explorer, and Ahrefs, to see what kind of domains are linking to your site and what type of links they are, relevant or non-relevant. If the majority of them are non-relevant, it will probably harm you, as you have too many non-relevant links pointing to your site. If you can barely find anything about it, I wouldn't worry too much.

    | Martijn_Scheijbeler
    0

  • For anyone stumbling across this, we were able to identify the issue. We had a WordPress installation that had a hidden database full of spam and bad links/images, etc. The database could not be located in MySQL, which was interesting, so we had the client agree to allow us to delete ALL content and start from scratch. (They had previously been hacked with another vendor but owned their WordPress website.) We had been migrating the database because they had hundreds of blog posts. We all agreed that since those weren't doing any good, we would start from scratch. We deleted the entire website off of the server, reinstalled WordPress, and we've seen a 745% improvement in keyword rankings. We have over 20 keywords now ranking on page 1. Woot!

    | Tosten
    0

  • I completely agree with Patrick: you get all the domain authority and backlinks from utilizing a subfolder in place of a subdomain. Google considers a subdomain a separate website. Therefore a subfolder should almost always be used; reserve subdomains for very unique cases. Hope that is enough to convince you to use the subfolder. All the best, Tom

    | BlueprintMarketing
    0

  • Thanks, yea didn't think it would be.

    | GregDixson
    0

  • Hi there If the PDFs' URLs are changing, then yes, you should redirect the old URLs to the new URLs. Even more so than from a search engine perspective, if users have bookmarked or saved the link to your content, they need to be redirected. My vote - include them; be on the safe side. Also make sure internal links and the XML sitemap are updated to include the new URLs. Hope this helps! Good luck! Patrick
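    As a sketch, an Apache .htaccess rule for the move might look like this (the paths are placeholders):

    ```apache
    # Send the old PDF location to the new one with a permanent redirect
    Redirect 301 /downloads/old-guide.pdf /resources/new-guide.pdf
    ```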

    | PatrickDelehanty
    0

  • Agreed, the redirects/canonicals should be permanent (well, for as long as you want the authority to pass along). You would usually see changes in the SERPs within 2 weeks.

    | OlegKorneitchouk
    1

  • Hi Kian_moz I feel it depends on how much you want to change and how big the impact of changing the URL as well as the title tag would be. My situation was quite similar to yours: I changed one of my blog post URLs and now the page is in 9th position, which is much better than before (middle of the 2nd page of the SERPs). But I lost 36 Facebook likes and 2 Facebook shares. My blog started a few months ago and is not popular yet, so the impact was small, and now I'm able to get more clicks, so changing the URL worked fine for me.

    | Yuki-hero
    0

  • For me Yoast is the best; I'm using the premium version for my blogs. Also:
    - Rankie - WordPress Rank Tracker Plugin (good for keyword tracking)
    - Rich Snippets WordPress Plugin (good for structured content)

    | Roman-Delcarmen
    0

  • Hey Andy, Somewhat heartening to know we're not alone, but I share your pain! My client went from a Domain Authority of 16 in December to a DA of 1 after moving to https. They have finally bounced back to 11, but it has taken 3 months AND they still only have 2 referring domains linking to the https version. There are 53 referring domains still linking to the http version. In fact, this has risen from 40 referring domains in the last month. So how can the number of referrers to http be increasing when there is an https version? I just don't get it! Anyone know (a) how this can happen and (b) how to address the issue?

    | muzzmoz
    0