Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Ben, I doubt that crawlers access the robots.txt file for each request, but they still have to validate any URL they find against the list of blocked ones. Glad to help, Don

    Intermediate & Advanced SEO | | donford
    0
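The rule matching Don describes can be sketched with Python's standard-library `urllib.robotparser`: robots.txt is parsed once, and each discovered URL is then validated against the cached rules rather than by re-fetching the file. The robots.txt content and URLs below are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration.
rules = """User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Every URL the crawler finds is checked against the cached rules,
# not against a fresh robots.txt request.
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # → False
print(parser.can_fetch("*", "https://example.com/blog/post.html"))     # → True
```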

  • In that case, you'd need to add the robots meta tag at the page level, before the closing </head> tag.

    Technical SEO Issues | | LauraSultan
    0
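As a minimal sketch of that placement, Python's standard-library `html.parser` can check that a robots meta tag sits inside `<head>`, which is where crawlers expect to find it. The HTML snippet is hypothetical.

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Record a robots meta tag only if it appears inside <head>."""
    def __init__(self):
        super().__init__()
        self.in_head = False
        self.robots_content = None

    def handle_starttag(self, tag, attrs):
        if tag == "head":
            self.in_head = True
        elif tag == "meta" and self.in_head:
            d = dict(attrs)
            if d.get("name", "").lower() == "robots":
                self.robots_content = d.get("content")

    def handle_endtag(self, tag):
        if tag == "head":
            self.in_head = False

# Hypothetical page: the meta tag sits before the closing </head> tag.
html = """<html><head>
<meta name="robots" content="noindex, nofollow">
</head><body>Hello</body></html>"""

finder = RobotsMetaFinder()
finder.feed(html)
print(finder.robots_content)  # → noindex, nofollow
```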

  • Hi Endre! If you haven't already, you may want to try running a full SERP report on those terms using the blue button in the upper right of that screenshot. It compares a lot more ranking factors and might give you a better idea of what's going on.

    Technical SEO Issues | | MattRoney
    0

  • Hi there. My thought is that it was just a coincidence. Usually, responsiveness doesn't affect rankings that dramatically, especially on desktop devices. There are many far more important ranking signals than being mobile-friendly. Make sure that your website is properly optimized, that you have a good backlink profile, etc. I also suggest looking at your competitors. If they have improved, or if new, better-optimized companies have appeared while your website remains underoptimized, your rankings will go down. Hope this helps.

    Intermediate & Advanced SEO | | DmitriiK
    0

  • It looks like the links were on the pages when Moz's crawler crawled them, but have since been removed. The site you link to above mentions that they're having some WordPress problems, which is a clue. What I suspect happened is that links to your competitor's site were part of some kind of link injection malware that was abusing an exploitable WordPress feature common to comics blogs. Often when a bot is injecting links to a bunch of sites onto vulnerable pages, some of those links will be the bot's "customers" - that is, people who have paid for their links to be put on a bunch of sites and aren't too picky about how they get there - and some will have just been pulled into the mix so that the spammers can mask their client list. So it may not be that this is something your competitor was doing on purpose; it's hard to know for sure. Either way, it's far more likely to be bad news for their site than good news, but it probably just won't have any effect one way or another. Since the links aren't there anymore, it's likely that the site owners caught and fixed the bug, which is why those links disappeared. If this were happening to your site, I'd recommend disavowing the sites in question and keeping an eye on your backlink profile for future weird links like these. Since it's happening to a competitor's site, though, you can ignore those links entirely and focus on the backlinks of theirs that are actually relevant and that you might want to pursue.

    Local Website Optimization | | RuthBurrReedy
    0

  • A common post-Penguin misconception is that you should avoid using important keywords in anchor text at all. You should avoid using the same keyword-stuffed anchor text across a bunch of backlinks to your site, but it's still a best practice to include important keywords in anchor text. Just don't be spammy about it: keep it natural and vary the anchor text. On your own website it shouldn't be a problem, as long as it's done naturally and in a way that's helpful to the reader.

    Intermediate & Advanced SEO | | LauraSultan
    0

  • Hi Cristina! Did Miriam's response answer your question? If so, please mark it as a "Good Answer." Otherwise, is there any way we can still help?

    Local Listings | | MattRoney
    0

  • Thanks. That is a good idea, and we are already doing it. The Moz scan gave us plenty of warnings about duplicate content, so I fixed it.

    Content & Blogging | | SirMax
    0

  • Hi Radi! Have Matt and/or Martijn answered your question? If so, please mark one or both of their responses "Good Answer." Otherwise, what's still tripping you up?

    Intermediate & Advanced SEO | | MattRoney
    0

  • Hi Kerry, If you use a 301, the noindex, nofollow rule will never be read. That's because as soon as the page is requested, the server redirects, so the meta tags in the HTML are never read. In short, I wouldn't worry about it if you're 301'ing. You should, however, make sure you update any sitemaps you may be using and change your internal linking to use the new URL instead of the old one. You don't want your site to continue linking to a page that just gets 301 redirected by the server; that is just good practice. Hope this helps, Don

    Technical SEO Issues | | donford
    0
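Don's point can be demonstrated with a short sketch using only the Python standard library (the paths and the in-process test server are hypothetical): the server answers the old URL with a 301 status and a Location header before any HTML is sent, so a noindex meta tag in the old page's markup would never reach the client.

```python
import http.server
import threading
import urllib.request

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old-page":
            # The 301 is sent before any HTML body, so a meta noindex
            # in the old page's markup would never be read.
            self.send_response(301)
            self.send_header("Location", "/new-page")
            self.end_headers()
        else:
            body = b"<html><head><title>New</title></head><body>ok</body></html>"
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Bind an ephemeral port and serve in a background thread.
server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# The client follows the 301 and only ever receives the new page's HTML.
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}/old-page")
print(resp.url.endswith("/new-page"))  # → True
server.shutdown()
```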

  • Lessons: Sometimes you need more than one landing page, even if it's for one service and you only serve one neighborhood. Why? If you know your target audience and it's mixed, visitors will be more likely to convert with a more personal landing page than with one general message for all of them. Don't be afraid to use different CTAs, and more than twice. Again, it might vary, but if your page is well designed and flows well, you can use several CTAs (I had a "pre-conversion" CTA to collect emails as well as a conversion CTA). For some people it's enough to read about your brand and they'll convert; for others, it's important to know how you do it before they convert or pre-convert. Having a CTA in front of their nose helps, but of course don't overdo it. Mobile first, always: 60-90% of all conversions came from mobile, CTR was higher, and CPC was lower. I don't know why, but it has happened many times with different local businesses.

    Local Website Optimization | | lovemozforever
    4

  • It sounds to me like you need to do a Content Audit with the goal of pruning all pages with zero traffic and zero links from the index. See the following resources: How To Do a Content Audit - Step by Step (Moz); Classic Content Audit Articles (LinkedIn); Content Audit Case Studies (LinkedIn); Using URL Profiler for Content Audits (URL Profiler); the slide deck for a presentation I gave last year about them; a recording of a webinar with that presentation; and Common Content Audit Strategies. I think the site you described would be considered Extra Large with a Penalty "Risk," so this is what the tool recommends: "Focus: Prioritization of Pages for Content Optimization. Are you SURE there is no content-based penalty risk? Most sites with this many pages have major content issues, meaning this would be the wrong “situation” for them. In the rare case that they don’t, prioritize the pages based on rankings, traffic potential, revenue… and propose how many to improve each month with ongoing copywriting and on-page optimization."

    Intermediate & Advanced SEO | | Everett
    0

  • Thank you very much, I really appreciate it.

    Moz Local | | 230893
    0

  • Hey Eric, I appreciate the positive attitude! This is definitely a good jumping-off point for my team and me. I only say that because pharma is heavily regulated, so things like link building and branded social media are almost always disapproved, and content updates go through a lengthy legal process before they can be published. We're already promoting the pages by using more internal links to Side A. But I still definitely like the attitude adjustment!

    Intermediate & Advanced SEO | | GTO_Pharma_SEO
    0

  • The original poster's situation sounds like merging two sites rather than moving a site from one domain name to another. Matt Cutts specifically recommended against using the Change of Address tool for merging sites in this video: https://www.youtube.com/watch?v=s6pyAWJ5BRs

    Intermediate & Advanced SEO | | ABullis
    0