Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you don't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Apologies for the delay in responses here. Thanks Andreas and Mike. We ended up doing just that and redirected 404 errors before doing a recrawl. Worked great! Thanks for your help.

    Intermediate & Advanced SEO | | Gavo
    0

  • New to creating disavow files and I have a few questions. My client's site has upwards of 2K bad/spam inbound links; the old site had gotten hacked.

    Q: Should I submit all of these at once, or is it better to upload them in batches? Does it matter? I will not have 2K links in my file, as I am using 'domain:' on a lot of them. Some seem to be one-offs, if you will, from some of the domains in the list, and I am listing those separately.

    Q: Is that the proper way to structure my file? Can I have individual URLs in the list along with using the domain: approach?

    Q: Do I need to include both the "www" and non-"www" versions of the domains in my list when using domain:?

    EXAMPLE:
    domain:www.exampleone.com
    domain:exampleone.com
    domain:www.exampletwo.com
    domain:exampletwo.com
    http://www.spammeddomain1.com/path/to/badlink.html
    http://www.spammeddomain2.com/path/to/badlink.html

    Is this acceptable? Thank you in advance.
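For reference, Google's documented disavow file format is a plain-text file that allows comment lines (starting with #), domain: entries, and individual URLs mixed together; a minimal sketch with hypothetical domains:

```
# Entire domain - a domain: entry is generally understood to cover its
# subdomains as well, so separate www and non-www entries are usually unnecessary
domain:spammy-example.com
# A single bad link
http://www.another-spam-example.com/path/to/badlink.html
```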

    White Hat / Black Hat SEO | | Critical_IT_Solutions
    0

  • With canonicals, I would not worry about the incoming pages. If the new content is useful and relevant, plus linked to internally, they should do fine in terms of indexation. Use the canonical for now, and once you launch the new pages, well a month after launch, if there are key pages not getting indexed, then you can reassess. The canonical is the right thing to do in this case.

    As for link equity, you are right, that is a simplistic view of it. It is actually much more intricate than that, but it's a good basic understanding. However, the canonical is not going to hurt your internal link equity. Those links to the different sorting options are navigational in nature, and the structure will be repeated throughout the site. Google's algo is good at distinguishing internal editorial links from those that are navigational in nature. Navigational links don't impact strength nearly as much as editorial links. My personal belief is that you are worrying about something that isn't going to make an impact on your organic traffic. Ensure the correct canonicals are in place and launch the new content. If that new content has the same issue with sorting, use canonicals there as well and let Google figure it out. "They" have gotten pretty good at identifying what to keep and what not.

    If you don't want the sorting pages in the index at all, you'll need to do one of the following:

    - Noindex, disallow in robots.txt: Rhea Drysdale showed me a few years back that you can do a disallow and noindex in robots.txt. If you do both, Google not only gets the command to noindex the URLs but also cannot crawl the content.
    - Noindex, nofollow using meta robots: This would stop all link equity flow from these pages. If you want to attempt to stop flow to these pages, you'll need to nofollow any links to them. The pages can still be crawled, however.
    - Noindex, follow: Same as above, but internal link equity would still flow. Again, if you want to attempt to cut off link equity to these sorting pages, any links to them would need to be nofollowed.
    - Disallow in robots.txt: This would stop Google from crawling the content, but the URLs could technically still be indexed.

    Personally, I believe trying to manage link equity using nofollow is a waste of time. You more than likely have other things that could be making larger impacts. The choice is yours, however, and I always recommend testing anything to see if it makes an impact.
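The meta robots variants described above can be sketched as follows (note that Google announced in 2019 that it no longer supports the unofficial Noindex directive in robots.txt, so treat that first option with caution):

```html
<!-- noindex, nofollow: keep the page out of the index and stop link equity flow -->
<meta name="robots" content="noindex, nofollow">

<!-- noindex, follow: keep the page out of the index but let internal link equity flow -->
<meta name="robots" content="noindex, follow">
```

And a plain Disallow rule in robots.txt (the /sort/ path is hypothetical):

```
User-agent: *
# Blocks crawling of sorting URLs; they can technically still be indexed if linked to
Disallow: /sort/
```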

    Intermediate & Advanced SEO | | katemorris
    0

  • I recommend this thread over at Linda Buquet's forum that was created when Google removed the location search from main search: http://www.localsearchforum.com/google-local-important/38549-9-tools-hacks-emulate-google-search-location-since-setting-gone.html

    Local Listings | | MiriamEllis
    0

  • Hi Martijn Scheijbeler, First of all, thanks for your response; I really appreciate your help. Anyway, yes, the same sitemap was submitted in Bing Webmaster Tools and Google Webmaster Tools. Google indexed 14k pages, Bing indexed only 1,100, and Yahoo only 90. I use Alexa and the Moz tools to monitor site performance; if you analyze the site, you can see this. Please let me know what I should do next.

    Online Marketing Tools | | blackbowchauffeur
    0

  • Looking forward to seeing those changes and improvements Rand

    Moz Tools | | wearehappymedia
    0

  • Having different file types in your site's URLs doesn't affect SEO. It may confuse you, and you may inadvertently link internally to pages that don't exist on your site, but there is no effect on SEO. There is a way to parse HTML files as if they were PHP pages--but nowadays I don't recommend that. If you do have HTML pages on your site, you may just want to leave them alone. But if you do think you need to change or update their functionality, then setting up a 301 Permanent Redirect from the old HTML page to a new .php page, for example, will do just fine. It should not have any effect on rankings.
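A 301 redirect like the one described can be sketched in Apache's .htaccess (both filenames are hypothetical):

```apache
# Permanently redirect the legacy HTML page to its PHP replacement
Redirect 301 /old-page.html /new-page.php
```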

    Local Website Optimization | | GlobeRunner
    0

  • Search engines won't index anchor URLs, so that's not an option. I agree. And I would try to redirect to a URL that the search engines already have in their index.

    Intermediate & Advanced SEO | | EGOL
    0

  • The quality of the interview will determine if people flock to your site.  It could be a rare opportunity to bring new clients, sponsors, or evangelists.   Prepare for them.  You have an opportunity to make people say WOW! or snore. Consider the topic of the interview and the content of your site, and decide what you want people to see when they land. Have your best and most relevant content clearly promoted. If there is great content for this audience that you don't have but could have, do whatever you can to get it.

    Interviews | | EGOL
    0

  • Oleg, Thank you very much. Still learning the ins and outs of Moz. I appreciate your answer. Denis

    Feature Requests | | DenisZilberberg
    0

  • Yes, you want the URLs to match the canonical tag, so the most effective method is a 301 redirect, making them match the canonical tag, the sitemap, robots.txt, etc. You can use a regex pattern like /?$ at the end of the URL; in the case of category URLs, it will allow them when needed. If you use the proper 301, you will not have to deal with the category issue anyway. <link rel="canonical" href="https://moz.com/community/q/duplicate-content-on-url-trailing-slash" /> I hope this is able to shed more light on the issue, and great answer, Eric. Hope I was of help, Tom
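Assuming Apache and that the slash-less URLs are the canonical ones, the redirect can be sketched in .htaccess:

```apache
RewriteEngine On
# Leave real directories alone so their trailing slash is preserved
RewriteCond %{REQUEST_FILENAME} !-d
# 301 any URL ending in a slash to the version without it
RewriteRule ^(.+)/$ /$1 [R=301,L]
```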

    Intermediate & Advanced SEO | | BlueprintMarketing
    1

  • Whenever you have products that are similar (but only different in color variations or size variations), you should use the canonical tag to specify this. Keep these URLs indexed, but generally speaking the canonical tag is there to help in these situations. There are literally thousands (or hundreds of thousands?) of sites using the canonical tag successfully.
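In practice the canonical tag is a single line in the <head> of each variation page; a sketch with hypothetical product URLs:

```html
<!-- On /widget-blue and /widget-red, point search engines to the primary page -->
<link rel="canonical" href="https://www.example.com/widget" />
```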

    Intermediate & Advanced SEO | | GlobeRunner
    0

  • In Google Search Console, you'll want to verify ALL of those versions of your site. Then, in each version, tell Google which version you prefer--the www or non-www version of your site. So, verify these:

    http://www.
    http://
    https://www.
    https://

    Choose to use ONE of those versions of your site. Then set up 301 redirects from all other versions to that version. Even though the redirects are in place, you'll still want to verify them all in GSC.
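Assuming Apache and https://www.example.com as the chosen version, the catch-all 301 redirect can be sketched in .htaccess:

```apache
RewriteEngine On
# Redirect any request that is not already https://www.example.com
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```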

    Intermediate & Advanced SEO | | GlobeRunner
    0

  • Hi Michael! There are 3 common approaches to your scenario, which can be described as follows:

    OPTION 1
    This represents a very basic, good structure to be used when all service cities are deemed of equal importance:
    - Build a unique landing page for each city served, optimized for each city + general info about your work in that city
    - Build a unique landing page for each service, optimized for each service but not optimized for geo terms.

    OPTION 2
    Given Google's extreme bias toward physical location, this option can be used to maximize your optimization for your city of location, while still giving secondary focus to additional service cities where you lack a physical location:
    - Build a unique landing page for each city served, optimized for each city + general info about your work in that city
    - Build a unique landing page for each service, optimized for each service and also optimized for your city of location, strengthening the association between your services and your core city.

    OPTION 3
    This option should only be considered by companies with significant funding and exceptional creative resources that will ensure that all pages are unique and useful rather than duplicative, thin and harmful: build a unique page for every possible keyword/geo combination. So:
    - Cloud Computing Sherman Oaks
    - Cloud Computing Van Nuys
    - Computer Repair Sherman Oaks
    - Computer Repair Van Nuys
    - etc.

    *Again, this last approach should only be undertaken if you are positive the content you'll be developing has a definite purpose for users and that you won't end up weakening your website with a big menu of weak pages. Options 1 & 2 tend to be the best bet for smaller companies with reasonable resources. Option 3 can work, but only where creative possibilities and big budget are available. Hope this helps lay this out in a way that makes sense!

    Local Website Optimization | | MiriamEllis
    0

  • We generally recommend keeping all of that content on the website, there are only a few cases where you would want to remove the content (for example if there are copyright or legal issues involved). Your site, over time, will become larger, and this is a good thing. Fashion trends tend to come back, so in 5 or 10 years if you still have that content on the site it may become relevant again. And, if it's been there for 10 years then there is a good chance that it will rank well--because it's been there 10 years and it's trusted.

    On-Page / Site Optimization | | GlobeRunner
    1

  • If you look at those links in Google Search Console (crawl errors), you'll see that there is a date there. If some show up with an older date (older than yesterday, for example), you can mark them as fixed. If those errors are still present, they'll show up again.

    Technical SEO Issues | | GlobeRunner
    0