Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Yes, correct: the blog is site.com/blog and has high-quality content with many high-ranking pages and a high quality score. I will clean it up and try to maintain it again, as the blog has not been looked after for a while and can be a good source of extra traffic. Many thanks for all the comments; it really helps!

    On-Page / Site Optimization | | bill369
    0

  • No, well-presented, user-friendly popups like exit-intent popups won't be affected by the ad blocking that's coming.

    Search Engine Trends | | ThompsonPaul
    0

  • I see this as good news. Sites break and pages go missing (according to Google's crawl) from time to time. I was concerned at first that permanent redirects had been added by mistake. I would give it some time and default to Google's usual advice for fixing 404s: just fix the error and monitor it in Search Console.

    Technical SEO Issues | | Eric.W.Caudle
    0

  • Same reason there's a difference between the number of backlinks reported by SEMrush and Majestic, Majestic and Ahrefs, Ahrefs and Moz, or any of the third-party backlink tools and Search Console. These tools crawl the web at different paces and start from different places. That's why some of them find backlinks that others don't.

    Link Explorer | | Igor.Go
    0

  • Although your site appears to load fine, Pingdom reported a load time of 6.41 seconds. You have a number of redirect chains, scripts that could be combined, a large request size, etc. Most of these things are easy to fix. There are many services out there where you can get a decent report, and the report will give you a good list of fixes to make. I use https://tools.pingdom.com. I've seen page load time (especially after recent site changes) adversely affect site ranking more than any other single cause. (A quick way to spot redirect chains yourself is sketched below this answer.)

    Technical SEO Issues | | Eric.W.Caudle
    0
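
    As a rough illustration of the redirect-chain point above, here is a minimal Python sketch, assuming the third-party requests library is installed (the example URL is hypothetical), that follows each hop of a chain manually and reports it:

    ```python
    import requests

    def trace_redirects(url, max_hops=10):
        """Follow a redirect chain one hop at a time, printing each step."""
        hops = 0
        while hops < max_hops:
            # allow_redirects=False lets us inspect each 3xx response ourselves
            resp = requests.get(url, allow_redirects=False, timeout=10)
            print(resp.status_code, url)
            if resp.status_code in (301, 302, 303, 307, 308):
                # Location may be relative; resolve it against the current URL
                url = requests.compat.urljoin(url, resp.headers["Location"])
                hops += 1
            else:
                break
        return hops

    # Hypothetical URL; substitute a page from your own site
    print("hops:", trace_redirects("http://example.com/old-page"))
    ```

    Each extra hop adds a round trip before the final page even starts loading, which is why collapsing chains into a single redirect is usually near the top of any speed report's fix list.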

  • It's not about loss of link juice; it's just not a good look to link sitewide in the footer to other domains that link back. My best practice: if the link is clicked by users and helpful for them, nofollow it. If the link is not clicked or not helpful and only exists for the sake of having links, delete it.

    Technical SEO Issues | | paints-n-design
    0

  • So by siloing content you mean creating categories and subcategories, like in a library: reference, fiction, magazines, etc., and then having subsections. The old way to 'rank' for something was to pick up keywords for the smaller sections and subsections. To use the example of dentistry: I might have adult dentistry and paediatric dentistry pages; then under adults I'll have implants, braces and veneers; then under veneers I might have an article about veneer prices, the veneer procedure and veneer risks. All of these would link to each other and link up in the architecture, and hey presto, I'd eventually rank for one of the 'top categories' like adult dentistry.

    The problem with this is that it's going to create internal competition and conflict. Google doesn't want users having to hop around highly granular subtopics for answers; they'd rather have the answer to a query all in the same place. So instead I'll now have one single page with everything people need to know about veneers: price, risks, procedure, etc., all in one place.

    There are further difficulties, because Google will sometimes consider two related things as different 'topics' answering different questions. So I do have a page for everything about veneers and also a page about veneer cost. In the case of veneers, everyone wants to know the cost; it's all cost, cost, cost, so this is its own topic and its own page. But for something like root canals, nobody cares how much they cost; they just want to get out of pain. So the root canal cost section is on the main root canal page, because it's included in the topic of 'root canals'.

    It's now more about searcher intent (https://moz.com/blog/how-google-gives-us-insight-into-searcher-intent-through-the-results-whiteboard-friday), possibly 'searcher task accomplishment' (https://moz.com/blog/harnessing-link-equity) and also how link equity flows (https://moz.com/blog/harnessing-link-equity). Also read this: https://www.cs.cornell.edu/home/kleinber/pcm.pdf. It's tough going, but just ignore what you don't understand and press on with reading it all, and you'll learn a great amount about how Google functions.

    So to answer the question: you still need a solid site structure, but I'd say 'siloing' is possibly going to dilute the potential power of each page. You're going to end up with 30 pages all about sub-subtopics that should be rethought and consolidated, using Google as your research tool. Always use Google as your research tool; to do anything else is like training for a sprint race by going swimming every day.

    'Siloing' for me also created a ton of duplicate content and duplicate headlines, and I even think I got stung by Maccabees for having pages about all the different aspects of implant dentistry. They are now all consolidated into a 'super page' and it's ranking #1 locally and really well nationally too. Page one.

    Imagine several pages whose H1s are 'braces cost', 'braces procedure', 'braces on finance' and 'braces risks'. Google is going to struggle, in my view, to rank me for any of those, because they all have an H1 containing the word braces. What would be better would be a 'braces' page where the H2s are all those sub-subtopics, then an FAQ with all the Google Suggest terms as H2s and all the 'searchers also asked' questions in the FAQ. (A quick way to spot this kind of H1 overlap is sketched below this answer.)

    Hope this helps; this is my interpretation from my small local business here in the UK, so other users here may have more relevant information. For example, IA, cannibalisation, internal conflict, etc. are much bigger issues in shopping and information businesses than in services businesses. And of course this classic: https://moz.com/blog/optimizing-for-rankbrain-whiteboard-friday, thanks to @miriam ellis for that one.

    Intermediate & Advanced SEO | | Smileworks_Liverpool
    0
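
    To make the duplicate-H1 warning above concrete, here is a minimal Python sketch; the page list and H1 values are hypothetical stand-ins for a crawl of your own site. It groups pages by the first word of their H1 and flags likely internal competition:

    ```python
    from collections import defaultdict

    # Hypothetical crawl output: URL -> H1 text
    h1s = {
        "/braces-cost": "Braces Cost",
        "/braces-procedure": "Braces Procedure",
        "/braces-finance": "Braces on Finance",
        "/braces-risks": "Braces Risks",
        "/veneers": "Veneers: Cost, Procedure and Risks",
    }

    # Group pages by the first word of their H1 (a crude proxy for the head term)
    groups = defaultdict(list)
    for url, h1 in h1s.items():
        groups[h1.split()[0].lower()].append(url)

    for term, urls in groups.items():
        if len(urls) > 1:
            print(f"Possible internal competition on '{term}': {urls}")
    ```

    Under the consolidation approach described above, the four 'braces' pages would collapse into one page whose H2s carry the subtopics.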

  • Thanks Ed for the UK and US analysis. It's very strange, as it only affects Australia. Moz outputs 11.5-30.3K for 'hives' with difficulty at 57, and 11.5-30.3K for 'urticaria' with difficulty at 40, so I guess Moz is treating 'hives' and 'urticaria' differently. I am just a bit wary that the volume range is the same given the method that is used; this article shows how they come up with the range: https://moz.com/blog/sweating-details-google-keyword-tool-volume. Given that Keyword Planner (Australia data; see image) gives very different monthly usage rates, I would expect the ranges to be different.

    Keyword Research | | niritoli
    0

  • Thanks SEOMAN. My issue is that this purposeful duplicate-content tactic is helping www.adspecialtyproductscatalog.com rank higher for a keyword like "ad specialty products" than http://www.advertisingproducts.net/, with the former using the same 4,000 words on every single page of its site. So if the goal is to rank #1 for the highest-searched keyword in my category, should one employ this content tactic as part of their SEO efforts?

    White Hat / Black Hat SEO | | KenSchaefer
    0

  • Hi Robert! The safest route here seems to be just adding an extra 301 to the https version; this will gather all the links out there. If you remove some of the middle URLs and one of them happens to be the target of a good link to your site, you'll lose that link's value. Google handles up to 4-5 hops well, so I think you'll be OK with just these 3 hops. To back up what I'm saying, Matt Cutts covered it in these videos: "Can too many redirects from a single URL have a negative effect on crawling?" and "Is there a limit to how many 301 (Permanent) redirects I can do on a site?" (A hop-counting sketch follows below this answer.) Hope it helps. Best of luck, GR.

    Technical SEO Issues | | GastonRiera
    0
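
    To confirm how many hops a chain actually takes, here is a minimal sketch (again assuming the third-party requests library; the URL is hypothetical):

    ```python
    import requests

    # Hypothetical starting URL; substitute your old http:// address
    resp = requests.get("http://example.com/page", timeout=10)

    # resp.history holds every intermediate 3xx response, in order
    for hop in resp.history:
        print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
    print("final:", resp.status_code, resp.url)
    print("hops:", len(resp.history))
    ```

    If the hop count stays at three or fewer, the chain is comfortably inside the 4-5 hops mentioned above.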

  • Sometimes it stops working, possibly due to an update. I have faced the same issue on my CDNcare.ca site.

    Moz Local | | fatetmpwcosl
    0

  • Hey mag777! Happy New Year! Great question, and one I'm asked a lot by my business advisory groups, clients and referral partners. It undoubtedly always comes up in conversation when talking about either acquiring new domains or revamping a website: "Should we use the WWW or non-WWW domain?"

    In my 12+ years of web consulting and SEO, it has boiled down to a matter of preference. As mentioned by seoman10 in this thread, the non-WWW version could be seen as a shorter, easier-to-remember URL. When marketing the domain, you'd always want to confirm the WWW is redirected to the non-WWW if this is the path you choose to take, and vice versa for non-WWW --> WWW. As Gaston mentions too, check out your competitors' URL structure; it's a quick glimpse up to the address bar while you are already doing your competitive research. Do some searches in Google as well for your website and competitors to see how much of the URL does or could display.

    Moz.com doesn't use WWW; they can get away with it because the domain is so short. I, on the other hand, almost need to with https://whiteboardcreations.com, since it's a much longer domain. Keep this in mind for the domains you're working on, too. From an SEO school of thought, and how I now operate, I choose the non-WWW simply because we can get just a little more of the URL to show in Google SERPs when we're targeting inner pages, providing a hint more of a visual for the searcher; the URL string matches more closely to the title and description. That is the way I look at this strategy.

    Here is a quick video from Matt Cutts a few years back that should ease your concerns over redirects: https://youtu.be/Filv4pP-1nw. Either way, the websites you work on for yourself or your clients will be fine as long as you are consistent for the entire site and the redirects are tested and confirmed functional (a quick consistency check is sketched below this answer). Cheers to a successful 2018 for you and everyone else reading! Patrick @ Whiteboard Creations (Apex, NC)

    Intermediate & Advanced SEO | | WhiteboardCreations
    0
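
    Here is a minimal sketch of the "test and confirm the redirects" step above, assuming the third-party requests library (the domain is hypothetical): fetch all four host/scheme variants and confirm they land on one canonical URL:

    ```python
    import requests

    # Hypothetical domain; substitute your own
    variants = [
        "http://example.com/",
        "http://www.example.com/",
        "https://example.com/",
        "https://www.example.com/",
    ]

    final_urls = set()
    for url in variants:
        resp = requests.get(url, timeout=10)  # follows redirects by default
        print(f"{url} -> {resp.url} ({resp.status_code})")
        final_urls.add(resp.url)

    # A consistent setup resolves every variant to a single canonical URL
    print("consistent" if len(final_urls) == 1 else "inconsistent")
    ```

    Whichever variant you choose, all four should resolve to it, ideally with a single 301 each.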

  • Hey Andrei! Tawny from Moz's Help Team here. I took a look at that site, and I don't actually see any H1 tags in the source code for that page: https://www.screencast.com/t/DJEjjTWprW6. I think that's why our tool isn't finding the keyword there: the tag simply doesn't exist! (If you'd like to verify this yourself, there's a quick check sketched below this answer.) I hope this helps! Let me know if you have any other questions or if there's anything that needs clarifying. You can always feel free to drop us a line at help@moz.com.

    Other Research Tools | | tawnycase
    0
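
    For anyone who wants to run the same H1 check on their own pages, here is a minimal sketch using only Python's standard library (the URL is hypothetical):

    ```python
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class H1Collector(HTMLParser):
        """Collect the text content of every <h1> on a page."""
        def __init__(self):
            super().__init__()
            self.in_h1 = False
            self.h1s = []

        def handle_starttag(self, tag, attrs):
            if tag == "h1":
                self.in_h1 = True
                self.h1s.append("")

        def handle_endtag(self, tag):
            if tag == "h1":
                self.in_h1 = False

        def handle_data(self, data):
            if self.in_h1:
                self.h1s[-1] += data

    # Hypothetical URL; substitute the page you are auditing
    html = urlopen("https://example.com/").read().decode("utf-8", errors="replace")
    parser = H1Collector()
    parser.feed(html)
    print(parser.h1s or "No H1 tags found")
    ```

    Note that this inspects the served HTML source, just like the check described above, so an H1 injected later by JavaScript won't show up here either.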

  • No worries at all. Where you put the pages in the menu should not impact organic traffic in terms of people coming directly from the SERPs to one of these landing pages, but it could impact the flow of traffic through the website (someone entering on the home page and then not seeing that you have these landing pages underneath an About tab or someplace else). So, ostensibly, this could impact the depth of the visits your website receives. The main point of giving these pages their own navigation heading is to increase on-site awareness that the pages exist.

    From my work with SABs over the years, I've noticed that it has become an expected standard practice to give these pages their own main menu tab, to be sure they're being found by users for whom specialized content has been created. I don't have any recent studies to prove this out, but it's always been a rule of human usability to stick with formats users are already comfortable with. Personally, I wouldn't be inclined to look for my city's landing page under an 'About' tab, but for an authoritative answer on this for your specific brand, you'd need to conduct a usability test in which you see exactly how users are interacting with your website. Sometimes, the results of those studies are extremely surprising.

    So, at the end of the day, it's always up to the owner to decide how they want to structure their website. What I've tried to offer here is standard best-practice advice. But the only way to know whether having a unique tab for service-city content, or putting these pages somewhere else, helps or harms usability and conversions is to do a formal study. If you don't want to invest in that right now, you could at least ask a few friends who aren't at all familiar with your site to use it while you watch over their shoulders. You might ask them a question like, "What would you do if you were trying to find out if we serve X city?" and then see how they try to find the answer. Things like that might lend some data to your decision about site navigation.

    Local Website Optimization | | MiriamEllis
    1

  • I doubt that you can get a definitive answer from a non-Googler (and Googlers' non-disclosure agreements sometimes make them fairly closed-mouthed); this will depend on how fast the new content is indexed and how soon after that the indexed results get processed by Google's ranking algorithm. To get the fastest results: 1) resubmit changed pages for indexing right after they are changed; 2) don't limit Googlebot's crawl rate, or fix it if it has been limited; and 3) don't limit (in robots.txt, etc.) what you allow Googlebot to crawl. (Who would do that? I did, after watching Googlebot crawl some WordPress folders where I felt it had no business; I corrected that fairly quickly. A robots.txt check is sketched below this answer.) Generally, I see an effect on SERPs fairly quickly, within a day or three. However, if you are in a very competitive market, there's always that Brownian motion caused by what your competition does... :)

    On-Page / Site Optimization | | GlennFerrell
    0
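
    As a quick follow-up to point 3 above, here is a minimal sketch using Python's standard-library urllib.robotparser to confirm Googlebot is allowed to crawl the URLs you care about (the domain and paths are hypothetical):

    ```python
    from urllib.robotparser import RobotFileParser

    # Hypothetical site; substitute your own robots.txt location
    rp = RobotFileParser("https://example.com/robots.txt")
    rp.read()

    # Hypothetical paths worth checking, e.g. pages you just changed
    for path in ["/", "/blog/new-post/", "/wp-content/uploads/figure.png"]:
        allowed = rp.can_fetch("Googlebot", f"https://example.com{path}")
        print(f"{'OK     ' if allowed else 'BLOCKED'} {path}")
    ```

    Keep in mind that robots.txt controls crawling, not indexing, so this only tells you whether Googlebot can fetch the page at all.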

  • We can help at Theia Marketing if you're interested!

    Search Engine Trends | | Islin
    2