Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • I agree with Andy, except that I would set the categories and the navigation bar for usability over pleasing the search engines. Blasphemy on an SEO Q&A forum, I know. Seriously, though, your #1 goal is to keep it as simple and uncomplicated as possible for users to find your products. SEO aids that, but it is not the goal in and of itself. The goal is more clients/sales/users. If you haven't done this already, put your categories/navigation sections on 3x5 cards or sticky notes, and then arrange them on a table or wall. Cross-reference the navigation words you are using with the search terms people use to look for this product, to make sure they are user-friendly. What taxonomy would make sense to your users and get them to your products the easiest and fastest? Then use that one. A complicated taxonomy that makes sense to robots but not to people will increase your bounce rates, not your sales. So... go for the users. -- Jewel

    Technical SEO Issues | | impactzoneco
    0

  • An internal page can absolutely compete against a home page. Just tonight I was doing some research, and 4 of the sites on Google's front page were internal pages.

    Local Strategy | | julie-getonthemap
    0

  • If we are using dynamic coding so that the search results page is just one page of code for ALL cities, then I assume we just need to add dynamic logic at the top of the page so the tag is generated from the city in the URL, right? Then when this one page loads for any city and any URL parameters, the tag will be custom. For example:

    URL: domain.com/tx/austin?urlparameters..... LINK: domain.com/tx/austin
    URL: domain.com/ca/los-angeles?urlparameters..... LINK: domain.com/ca/los-angeles

    Is that correct? This would just be dynamic logic in our codebase on the one template?
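    If the tag in question is a rel="canonical" link, the template can derive it by stripping the query string from the requested URL. A minimal sketch in Python (the function name is hypothetical; any server-side language works the same way):

    ```python
    from urllib.parse import urlsplit, urlunsplit

    def canonical_url(full_url: str) -> str:
        """Strip the query string and fragment so every parameterized
        variant of a city page points to the clean city URL."""
        parts = urlsplit(full_url)
        return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

    # The template would then emit something like:
    #   <link rel="canonical" href="https://domain.com/tx/austin">
    clean = canonical_url("https://domain.com/tx/austin?sort=price&page=2")
    ```

    The same helper handles every city because it only looks at the path, never the parameters.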

    Technical SEO Issues | | ErnieB
    0

  • Hi GaryBES! Did you see Hurf's thoughtful reply to your question? Please let us know if you were able to sort things out or not -- and if Hurf's response helped answer your question, please mark it as a "good response." Thanks! Christy

    Online Marketing Tools | | Christy-Correll
    0

  • I am not an SEO master, but personally I think that disavowing these links would be ideal. According to Alexa statistics, this website is not that bad: a global rank of 237,621. However, I think most of the site's content doesn't contain very important information; I mean the blogs don't have valuable content. Having a link on this website might help you in certain ways, but generally I don't think it's a good idea to have links on this website. Source: http://www.alexa.com/siteinfo/hotel-online.com

    Link Building | | ahmetkul
    0

  • Hi Beachflower, Did you ever get a resolution to this? I'm curious to see what the outcome and solution was. If this is a malicious attack then you'll need to consult someone who specializes in net sec, but I've dealt with a few different kinds of attacks before, so I can make a couple of recommendations.

    1. Change all of your logins. Make them unique and difficult for a bot to guess, then set the site to lock out users after five incorrect guesses. This prevents brute-force attacks.
    2. Add a honeypot to your login forms. A honeypot is a hidden field that bots will try to fill out on a form. Users can't see it, so they don't fill it out. If it gets filled out, the program knows it's a bot and invalidates the login attempt.
    3. Use Screaming Frog to find all the JS that was maliciously inserted on each URL and create a "cleanup" list. A developer should be able to write a simple find-and-replace program that just deletes it.
    4. Consider migrating to HTTPS if you haven't already. This can prevent man-in-the-middle (MITM) attacks on your site, and it also confers several SEO benefits such as improved user experience, a slight boost in ranking, and faster site speed (HTTP/2 support).

    These are just a few first steps to take, and a net sec professional will have much more to add. Hope that helps!
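    For step 3, the find-and-replace program really can be tiny. A hedged Python sketch, assuming the audit showed the injected tag always matches one pattern (the evil.example.com domain is a placeholder for whatever the crawl actually found):

    ```python
    import re

    # Hypothetical pattern: adjust to match the script tag your crawl exported.
    INJECTED = re.compile(
        r'<script[^>]*src="https?://evil\.example\.com/[^"]*"[^>]*>\s*</script>',
        re.IGNORECASE,
    )

    def clean_page(html: str) -> str:
        """Delete every occurrence of the injected script tag from one page."""
        return INJECTED.sub("", html)
    ```

    A developer would loop this over the "cleanup" list of files exported from the crawl, writing each cleaned page back out.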

    On-Page / Site Optimization | | brettmandoes
    0

  • Hi there! My sincerest apologies for not seeing this last response! We do consider our figures up to date for the sites that we have indexed, but the indexing process can make this seem like it isn't the case. The reason you haven't yet been indexed is not that you have a large site but simply that we haven't discovered your site yet. A few points on how we compile our index:

    1. We grab the most recent index.
    2. We take the top 10 billion URLs with the highest MozRank (with a fixed limit on some of the larger domains).
    3. We start crawling from the top down until we've crawled ~130 billion URLs.

    The idea here is that we're focusing on the highest-quality links we can find, coming from the most prominent pages of authoritative sites. So, while you may not see every link for a site within our index, we're aiming to report the most valuable ones available. Most new sites and links will be indexed by our spiders and available in Mozscape and Open Site Explorer within 60 days, but some take even longer for many reasons, including the crawlability of sites, the number of inbound links to them, and the depth of pages in subdirectories. You can see our most recently updated schedule, as well as some more technical metrics, on our Mozscape API Updates page, and you can see when the last and next updates happened on the Open Site Explorer (OSE) homepage at any time. Since Moz focuses on quality of links over quantity, we are always focused on displaying the most relevant links to our users. Because of this, Moz's index may leave out some of the lower-quality (non-link-juice-providing) links, which might explain some discrepancies with what other tools show.

    Getting Started | | samantha.chapman
    0

  • It seems you didn't get my question. There will be no duplicate content on the pages. All I am asking about is how to optimise subdomain pages with "keyword" and "brand" in the page titles. Generally we have used "brand & keyword" on all pages of the website. Can we do the same for the subdomain pages? Does having so many pages with "brand & keyword" help or hurt rankings?

    Intermediate & Advanced SEO | | vtmoz
    0

  • Thanks for the great response, some really useful thoughts. To address your final point, the site is considerably stronger than the content creator's, so it's reassuring to hear that this could be the case. Of course we'll be recommending that as much of the data as possible is curated and that the pages are improved with original content.

    Technical SEO Issues | | BackPack85
    1

  • Alick300 and Hurf, thanks so much for the answers and especially the resources you shared! Very helpful!

    White Hat / Black Hat SEO | | Scratch_MM
    0

  • Phew, what an adventure a user must be going on each time! Now the horror part of the adventure - the audit! - ARGH! That fun time in an SEO's life when they get elbow-deep into numbers. The best strategy is to find all the links and rejig them: rather than have link to link to link, just cut out the middle man and go direct - assuming the link is even worth doing that for! But to answer directly:

    1. The original domain is still there, so the value is still there; it's just going to a dead domain. If you repoint it, that should still work, but this is a bit of a grey area!
    2. Same as if they went to a 404 on a live site: they are still there, just not going anywhere. You resolve it the same way you would a 404 page, with a redirect that can benefit a user looking for a resource.
    3. Yes, you just need to go to the original source and ensure it points to the correct place.

    It's all a bit of a grey area and may do more harm than good, especially if the links are a bit dodgy, but you can still move a link that points from A to B the same as you would an internal link with a 404. Hope that helps and good luck!
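    The "cut out the middle man" step can be automated once the audit has produced the old-to-new pairs. A rough Python sketch (the mapping would come from your own crawl; the paths are illustrative):

    ```python
    def flatten_redirects(redirects: dict[str, str]) -> dict[str, str]:
        """Resolve each redirect to its final destination so a chain
        A -> B -> C collapses to a single hop A -> C.
        The `seen` set guards against redirect loops."""
        flattened = {}
        for start in redirects:
            seen, target = {start}, redirects[start]
            while target in redirects and target not in seen:
                seen.add(target)
                target = redirects[target]
            flattened[start] = target
        return flattened
    ```

    Each entry in the flattened map is then the single direct link to point at, instead of hopping through dead domains.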

    Intermediate & Advanced SEO | | GPainter
    0

  • It's not about the canonical; it's about crawl optimization. I know that the canonical URL saves the situation here. I am working in a fail-safe mode with regard to duplicates, and I want to believe that the canonical URL implementation on my website is better than good. I just don't want bots spending time on pages that have nothing of substance to say and are canonicalized to the pages that have the important content. That is why I configured the URL parameters tab in GWT so the bot doesn't crawl those parameters - and, eventually, so those results may even be dropped.
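    As an aside, a similar effect can be approximated in robots.txt instead of (or alongside) the GWT URL-parameters tab. This is a hypothetical fragment - the parameter names are placeholders for whichever ones are canonicalized on your site, and blocking crawl this way also stops Google from seeing the canonical tag on those URLs, so it is a trade-off:

    ```
    # Hypothetical: keep bots off parameterized duplicates
    User-agent: *
    Disallow: /*?sort=
    Disallow: /*?filter=
    ```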

    Technical SEO Issues | | dos0659
    0

  • Make sure you're doing this for the right reasons: don't do it in an effort to improve rankings; do it because it improves the user experience, particularly if the site is sizeable. Adding an HTML version of the sitemap can help users find what they want on your site as quickly as possible. This will reduce bounce rates and increase time on site, which can be a signal that your site delivers content that is relevant to the user's search (which is Google's primary objective: if they deliver relevant search results, people will continue to use their service). Everything you do to help your visitors find what they are looking for (be that a product or information) as quickly and as painlessly as possible will benefit you directly, and Google will reward you for it. From the Google Webmaster Guidelines (under the "Help Google Find Your Pages" section, https://support.google.com/webmasters/answer/35769?hl=en):

    "Provide a sitemap file with links that point to the important pages on your site. Also provide a page with a human-readable list of links to these pages (sometimes called a site index or site map page)."

    "Limit the number of links on a page to a reasonable number (a few thousand at most)."

    If it's a large site, you may want to break this down over several pages. I'd be inclined to add a link to this in the footer of the site and/or on an on-site search page. Good luck.
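    The human-readable sitemap page can even be generated from the same URL list that feeds the XML sitemap. A minimal Python sketch (the page titles and URLs are illustrative, and you'd wrap the output in your site's normal template):

    ```python
    from html import escape

    def html_sitemap(pages: list[tuple[str, str]]) -> str:
        """Render a human-readable sitemap fragment: a plain list of
        links to the important pages, given (title, url) pairs."""
        items = "\n".join(
            f'  <li><a href="{escape(url)}">{escape(title)}</a></li>'
            for title, url in pages
        )
        return f"<ul>\n{items}\n</ul>"

    fragment = html_sitemap([("Home", "/"), ("Products", "/products")])
    ```

    For a large site, chunk the pages list and emit one such fragment per sitemap page to stay under a reasonable link count.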

    Search Engine Trends | | Hurf
    0