Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Hi Jay, Set your login/account pages to not be indexed. Both ways work equally well; it comes down to whichever is easier for you to implement. Personally I prefer robots.txt: it's just one file and you're not messing with page code. (A minimal sketch of both options follows this answer.) Best of luck. GR.

    Intermediate & Advanced SEO | | GastonRiera
    0
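
    A minimal sketch of the two options mentioned above; the /login/ and /account/ paths are placeholders, not taken from the original question:

        # Option 1 - robots.txt at the site root (tells compliant crawlers
        # not to crawl these paths)
        User-agent: *
        Disallow: /login/
        Disallow: /account/

        <!-- Option 2 - a robots meta tag in the <head> of each page
             (tells engines not to index it; the page must remain crawlable,
             i.e. not also blocked in robots.txt, for this tag to be seen) -->
        <meta name="robots" content="noindex">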

  • No problem at all, happy to help. Unfortunately the best tools we have to evaluate these are tools like Open Site Explorer, which try to emulate how Google looks at links, but they're imperfect for the very same reason that I can't possibly give you a definitive answer: Google doesn't want us to know! Unfortunately, the only way we can ever know the outcome is to implement the change and see if the rankings get better or worse - welcome to the struggles of SEO!

    If you really can't afford to be taking a hit right now but it would be more acceptable in a month or two (e.g. right now is your busiest period), I'd be inclined to wait. Otherwise, it's a tough call but I'd still lean toward having them removed. Don't forget that Google has been promising a Penguin (backlinks) update "very soon" all year! If that damn update finally rolls out tomorrow you may find yourself getting slammed by it... or it could roll out next year... or maybe it'll roll out and you'll be fine. Sigh.

    We have had success in doing it steadily with one of our larger clients who were in a similar situation, and the results were as good as we could have hoped for, but YMMV. We essentially did the removal in stages. We divided the bad domains up into batches, contacted the first batch requesting removal, then disavowed. While all this was happening we also got to work building quality links to the site, so the two roughly cancelled each other out. Then we did the same thing with the other batches of bad links until we'd been through the lot.

    For us, the end result was a series of fairly marginal peaks and troughs that directly correlated with link removal and link acquisition, so the net position at any given time was approximately the same. I must stress though that YMMV here - since I have a total data sample of 2 domains (this client has 2 companies/sites), it's impossible for me to say with absolute certainty that what I saw is the direct result of our process.

    Intermediate & Advanced SEO | | ChrisAshton
    0

  • Hey, it depends on a few factors here:

    1. What is the desired geographic target for each of these? I am assuming the .au and .co.uk are for Oz and the UK, so geo-target them (these are targeted automatically as ccTLDs) in Webmaster Tools and sign those off.
    2. The .com is generic so it could be competing - ensure you geo-target this to the US if that is the desired geography.

    This ensures they are not competing with each other and there will be no duplication issues, but it is based on the US, Australia and UK assumption. (A rough hreflang sketch follows this answer.) Hope that helps, Marcus

    Intermediate & Advanced SEO | | Marcus_Miller
    0
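
    hreflang annotations are not mentioned in the answer above, but they are a common code-level complement to geo-targeting. A rough sketch, assuming placeholder domains example.com (US), example.com.au and example.co.uk:

        <!-- The full set, including a self-reference, goes in the <head>
             of each of the three home pages -->
        <link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
        <link rel="alternate" hreflang="en-au" href="https://www.example.com.au/" />
        <link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/" />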

  • Thank you for the valuable suggestions

    On-Page / Site Optimization | | EBSAR
    0

  • Keep it as precise as possible. Whether you disavow the whole domain or not is your choice - there's no problem doing that if you need to. However, if you are sure there is literally only one link on the site, it is probably advisable to only disavow the specific URL. To the best of my knowledge there is no problem with disavowing a full domain. As ThompsonPaul has said, if you disavow the whole domain it will affect the entire domain including subdomains, so treat it with respect. The other thing is to be sure the link is doing you harm before you remove it - I have seen even so-called spam links knock a site down by a few points once disavowed. (A hypothetical disavow file illustrating the URL-versus-domain distinction follows this answer.)

    Intermediate & Advanced SEO | | seoman10
    0
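
    A hypothetical disavow file illustrating the distinction above; the domain shown is a placeholder:

        # Lines starting with # are comments in Google's disavow file format

        # Disavow only a single known bad link (one specific page)
        http://example-spam-site.com/page-with-the-link.html

        # Disavow an entire domain; per the answer above this also takes in
        # its subdomains, so treat it with care
        domain:example-spam-site.com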

  • Hi Lance, If you do end up finding Flash on your site using http://seositecheckup.com/tools/flash-test, these links might also be helpful: https://moz.com/blog/flash-and-seo-compelling-reasons-why-search-engines-flash-still-dont-mix (Rand's opinion about Flash and SEO) and http://www.toprankblog.com/2009/11/seo-for-flash-tips/ (existing Flash SEO tips). SE-Flash.com is a tool that visually shows what from your Flash files is visible to search engines and what is not. This tool is very useful even if you already have the Flash Search Engine SDK installed, because it provides one more check of the accuracy of the extracted text. Besides, it is not certain that Google and the other search engines use the Flash Search Engine SDK to get contents from a Flash file, so this tool might give completely different results from those the SDK will produce. I hope this helps; please feel free to respond if you have additional questions. Regards, Vijay

    Link Explorer | | Vijay-Gaur
    0

  • I am happy to hear that the community is still a big priority. I personally think that Moz's SEO software has become better and better. I use it along with other tools. Things like the content tool and the new keyword tool are really outstanding. Moz Analytics has been improving and is a quality tool. I cannot imagine one platform providing everything we need today for search, but I am happy with Moz and other tools for unique purposes. My hope is that the community keeps getting better. I'm glad to hear it is considered a priority as well. Thank you for providing such quick feedback. Respectfully, Thomas

    Moz News | | BlueprintMarketing
    6

  • Hi gygnyc! Have you purchased a paid Moz Local account? If not, I would go ahead and do that, and then you will be able to provide Moz with the correct URL for your Facebook page. Sometimes it takes a little while for Moz Local to pick up on the listing -- even if you provided them with the exact URL. I would give it some time, continue to monitor it, and then if it still does not pick up the Facebook listing I would reach out directly to the support team. Hope this helps!

    Moz Local | | BlueCorona
    0

  • Hi Ira! Mike Blumenthal wrote about this same question at LocalU in 2014. See here. The discussion in the comments there is quite good.

    Reviews and Ratings | | MiriamEllis
    1

  • Hi Surreyroofcare! Sometimes it takes a little while for Moz Local to pick up on the listing -- even if you provided them with the exact URL. I would give it some time, continue to monitor it, and then if it still does not pick up the Facebook listing I would reach out directly to the support team. Hope this helps!

    Moz Local | | BlueCorona
    0

  • This quote is from Moz's domain setup guide: "Since search engines keep different metrics for domains than they do subdomains, it is recommended that webmasters place link-worthy content like blogs in subfolders rather than subdomains. (i.e. www.example.com/blog/ rather than blog.example.com) The notable exceptions to this are language-specific websites. (i.e., en.example.com for the English version of the website)." I think that quote is pretty compelling in favor of subdirectories, and the case for subfolders over subdomains certainly makes sense, especially from a linking, domain age, and link-equity perspective. Ultimately, it comes down to you and what you want for the website.

    Technical SEO Issues | | BlueCorona
    0

  • Yes, I agree with Paul - the www and non-www versions are still resolving separately, and that needs to be fixed; it's part of your problem, assuming the site in question is TapGoods, which you never confirmed. (A sample redirect rule follows this answer.)

    Moz Tools | | Joe.Robison
    1
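
    One common way to make the www and non-www versions resolve to a single host is a server-side 301 redirect. A sketch assuming an Apache server with mod_rewrite and a placeholder domain; nginx or a CMS setting can achieve the same result:

        RewriteEngine On
        # Send non-www requests to the www host, preserving the requested path
        RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
        RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]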

  • Thanks! I turned off Geolocate (with page caching support) and, as you said, it corrected the problem. Thanks again. Bob

    Technical SEO Issues | | DML-Tampa
    0

  • Thanks! Someone in another thread suggested CrawlMonster - so far, "meh" - I prefer Moz and Screaming Frog. Can anyone suggest any other tools for managing this process?

    Technical SEO Issues | | seo_plus
    0

  • Hi there, Remember that ranking movements sometimes don't have much to do with algorithm updates - there are thousands of reasons and variables to analyze. For updates, you can find up-to-date info at mozcast.com and Algoroo.com. Best of luck!

    Technical SEO Issues | | GastonRiera
    0

  • Nope, no need for the images - they just know about the content and link to it. The cached HTML shows they store a copy (or cache) of the HTML, though. I could be wrong about the images, but that would dramatically increase their storage needs, so it seems unlikely.

    Technical SEO Issues | | Marcus_Miller
    0

  • You can't get their domain validated in your Google Search Console account (unless they add you as a user to their account). However, there is a workaround, more or less described here: Getting CDN images indexed with Google. In short: if your image is, for example, on https://12345.maxcdn.com/images/directory/subdirectory/image-filename.jpg, you can add a CNAME (an alternate domain name) so the URLs are formatted like this: http://images.yourdomain.com/images/directory/subdirectory/image-filename.jpg. After that you can add "images.yourdomain.com" as a verified domain in Google Search Console and treat it as a new subdomain, including sitemaps etc. (The DNS side of this is sketched below.) Hope this helps.

    Technical SEO Issues | | ConclusionDigital
    0
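
    A sketch of the DNS side of that workaround in zone-file notation, reusing the placeholder hostnames from the answer above:

        ; Alias the friendly image hostname to the CDN hostname
        images.yourdomain.com.   IN   CNAME   12345.maxcdn.com.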

  • What software system is your website built with, Archie? If it's a Content Management System like WordPress, Drupal or many others, it's entirely likely that there is an SEO plugin that will make implementing canonical tags for all your URLs very easy. (A sample canonical tag is sketched below for reference.)

    Technical SEO Issues | | ThompsonPaul
    0
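
    For reference, a canonical tag is a single line in the <head> of each page; a minimal sketch with a placeholder URL (an SEO plugin, as suggested above, would normally generate this automatically for every URL):

        <!-- Points search engines at the preferred URL for this content -->
        <link rel="canonical" href="https://www.example.com/preferred-page/" />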