Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you don't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Just because Google is choosing not to display your titles and descriptions doesn't mean it hasn't indexed them. The content is in the source code, it's indexed, and it's being taken into consideration for ranking purposes. Google ultimately controls what is displayed, however; titles and meta descriptions are merely a polite suggestion, and there is no way to opt out of those rewrites 100% of the time.

    NOODP is only good for opting out of DMOZ descriptions, if you even have one there at all. (Most newer sites probably won't, since inclusion hasn't been relevant for at least 5 years, probably longer.) It's not the only alternative snippet source: it's much more likely that Google will just find relevant text from the page itself to populate a description snippet, and NOODP will not block that.

    If Google is choosing its own snippet over mine, I don't always worry about it unless it's choosing something that's just flat-out bizarre. I also like to look at Google's rewrites as instructive. The algorithm is serving up something that it thinks is more relevant to your query, and these days, with RankBrain and the entire menagerie in play, it's getting pretty smart. So if Google is rewriting my description, which means Google has decided that particular text is relevant, then I'm going to take notes on what text it's grabbing and seriously consider adjusting my own description to be a closer match to whatever Google is inclined to serve up. If Google is literally pointing out the most relevant text for that query, maybe we should listen, right?

    Search Engine Trends | | BradsDeals
    0

  • Hi Alex, Sounds like you've already got the nav changes under control, which is great! Technically DA is based on your link profile, so that won't change, though if you mean just the overall strength of the site, it will drop away from the SERPs once the 301 to the subfolder is in place. In case you haven't already thought of it, once that redirect is in place, consider reaching out to any valuable referring domains pointing to .ae and asking if they can update the link to .com/ae since the site has moved. This is some easy low-hanging fruit and a great way to drive links directly to subpages, as I'd mentioned doing earlier. You will retain most of the strength from these links through the 301 anyway, but why take that 10-20% loss in strength if the referring sites are happy to update a single href?

    International Issues | | ChrisAshton
    0

  • Well, shoot. Apparently I'm not quite as familiar with UTF-8 characters and that distinction as I thought. Totally my mistake. The line for our tools falls closer to English character sets; even seemingly small modifiers like accents can throw off our ability to accurately count characters or detect matches on pages. So when it comes to Hebrew characters, we simply don't have a good way to handle them.
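    To illustrate the distinction being described (this is a generic Python sketch, not Moz's actual implementation): the same visible string can be stored in composed or decomposed Unicode form, and a naive character count or string match will disagree between the two unless you normalize first.

    ```python
    import unicodedata

    # The same visible word, "café", in two Unicode forms:
    composed = "caf\u00e9"     # 'é' as one precomposed code point (NFC form)
    decomposed = "cafe\u0301"  # 'e' + combining acute accent (NFD form)

    # Naive counting and matching disagree, though both render identically:
    print(len(composed))            # 4 code points
    print(len(decomposed))          # 5 code points
    print(composed == decomposed)   # False

    # Normalizing both to NFC makes counts and comparisons agree:
    nfc_a = unicodedata.normalize("NFC", composed)
    nfc_b = unicodedata.normalize("NFC", decomposed)
    print(nfc_a == nfc_b)           # True
    print(len(nfc_a))               # 4
    ```

    Scripts with many combining marks (Hebrew vowel points, for instance) amplify exactly this kind of mismatch.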

    Link Explorer | | JordanRailsback
    0

  • Hi Hemant, Unfortunately for you, Moz's spam score is doing its job very well here - there is a lot of spam in your backlink profile. Looking at your list of referring domains, there are a lot of .ru, .in, .jp links etc., which is strange for a US dentist, particularly when they're not in English. The anchor profile is a pretty good indication of spam here as well: no mention of the company name, and in the top 20 most-used anchors, only 1 isn't a blatant keyword. The other signal I noticed that's often an indication of spam is the fact that you went from 25 links to 411 in a couple of days. Basically, the best way to lower your spam score is to lower your spam. Quality links should be relevant to what you do and come from high-quality websites.

    Local Strategy | | ChrisAshton
    0

  • Canonicalization is a perfectly acceptable way to deal with this, particularly if you only have 3 or 4 of these pages. On the other hand, if you have quite a few, I'd be more inclined to block them via robots.txt to save some crawl budget. For what you're looking to do here, the outcome is essentially the same; one just stops Google from spending valuable resources crawling those pages unnecessarily. For more info, there is some helpful discussion and further reading in this Q&A post if you're interested.
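    As a sketch, the robots.txt route might look like this (the paths are hypothetical placeholders for whatever duplicate pages you're dealing with):

    ```
    User-agent: *
    Disallow: /print/
    Disallow: /search-results/
    ```

    Note that Disallow only stops crawling, not indexing of already-known URLs, which is part of the trade-off between the two approaches.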

    Paid Search Marketing | | ChrisAshton
    0

  • Logan is correct: when you use a 301 redirect from one page on your domain to another on the same domain, the link equity is passed (all of it, 100 percent). So migrating from http to https isn't going to hurt at all; you won't lose any link equity. I still prefer to update any links on other sites whenever I can, such as links from social media profiles and any other links I can get updated.
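    For reference, a common site-wide http-to-https 301 on Apache looks roughly like this (assuming mod_rewrite is enabled; nginx and other servers have their own equivalents):

    ```
    RewriteEngine On
    # Redirect any non-HTTPS request to the same host and path over HTTPS
    RewriteCond %{HTTPS} off
    RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
    ```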

    Technical SEO Issues | | GlobeRunner
    0

  • Hi, Canonical tags are used when the same content exists in different locations on your website. For example, if you have a blog post that is also available in a printable version, you would want to put a rel=canonical tag on the printable version that points to the original piece of content (to tell search engines that you want them to show that original piece and forget about the printable version). Please read this guide from Moz; it will explain everything you need to know about canonicalization. I hope this helps!
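    In markup, that looks like the following (URL is a hypothetical example):

    ```html
    <!-- In the <head> of the printable version, pointing at the original post -->
    <link rel="canonical" href="https://example.com/blog/original-post/" />
    ```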

    Intermediate & Advanced SEO | | solvid
    0

  • Hi Faisal, On the local part of your question: yes, if you want to target 3 different cities, the most common proceeding would be to create a really strong unique landing page for each city. This blog post may help: https://moz.com/blog/overcoming-your-fear-of-local-landing-pages

    Intermediate & Advanced SEO | | MiriamEllis
    0

  • I always find these discussions interesting, mainly because of the "nobody knows" factor. What I want to point out more than any one particular thing is that there are a limited number of 'ranking factors.' Whether that number is 200, 50, 3, or 3,000, it's limited. Whenever one tactic loses its effectiveness, simple math says that other factors increased in importance. Google is very, very good at messing with SEOs. If keyword density is limited or removed, something took its place. Something that isn't "create quality content." Most serious competitors are creating high-quality, relevant content. If you think the difference in ranking one bank over another is "quality content" on a target page for "home loan," you're lying to yourself.

    Backlinks, age of the page, age of the site, internal links, anchor text, and some level of "keyword density" (though I think it's much more sophisticated than that) definitely help. H1s still matter, as do H2s and H3s, tbh.

    I have competitive keywords ranking with NO on-page content AND no backlinks. The page literally has a title, an H1 tag, and the surrounding menus and sidebars. It ranks for gambling-related keyphrases in a supposedly hard-to-rank niche and has ranked for months (with zero on-page text). SEMrush shows that same site ranking for over 1,200 keyphrases. It has ONE backlink to the homepage. That's it.

    I wish that was the only example, but I am also ranking semi-competitive marketing and SEO-related keyphrases on a site with about 8 links and virtually no content. If content + links = SEO, these would never rank. So again, it's beyond that. Age of the domain? No, one is brand new and one is older. One is registered for more than a year, one for less. One is an EMD, one is not.

    We've had new clients struggle and struggle to rank for really easy keyphrases with no backlink spam, where technical on-site looks good and titles/content/links are all in line with other (ranking) clients. We put all their content on a new domain and it ranks just fine. NO links. SEO is just weird. Let's face it: we're all attempting to do the best we can for clients, but at the end of the day, none of it truly makes that much sense.

    Intermediate & Advanced SEO | | MattAntonino
    4

  • Hi Tom, this will likely impact your mobile rankings, as the smartphone crawler is the one that primarily feeds the mobile index. Best practice would say to make the mobile content as close to the desktop version as possible, but I understand that sometimes the resources are simply not available. My concern would be that if you remove all internal links to the page, the crawler will no longer find that page in new crawls and may eventually remove it from the mobile index, or may discount the importance of the page. Of course, if it isn't mobile friendly, you may end up losing rankings anyway if competitors create a more mobile-friendly experience. A couple of thoughts:

    1. Is the page getting valuable traffic from that position 1 ranking? If so, I'd strongly suggest including it on your list of pages to make mobile-friendly. If not, perhaps it doesn't matter if it loses the high ranking.

    2. You could create a sitemap which includes those pages, so that the crawler can discover them even if they're not linked to internally. I don't know that this will prevent a negative impact, but it could aid discovery.

    Hope that helps.
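    For the sitemap idea, a minimal XML sitemap listing an otherwise-unlinked page would look something like this (URL is a hypothetical placeholder):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/unlinked-but-important-page/</loc>
      </url>
    </urlset>
    ```

    Submit it via Search Console (or reference it in robots.txt) so the crawler actually picks it up.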

    Intermediate & Advanced SEO | | bridget.randolph
    0

  • Hi Dmytro, That's precisely my concern. I got in touch with the owner of the website linking to us and asked him to nofollow our link. I'll monitor and see if there are any noticeable changes. Cheers J

    Intermediate & Advanced SEO | | vcj
    0

  • Thanks for the feedback; I'm leaning towards the archiving mentioned. I have a follow-up on that, though: in this high school football example, usually several times a year we'll get a flood of traffic about a former student who did something "big" in college or the pros. Since we have articles that rank well for his or her name, due to coverage of them back in high school, we'll often double or triple our normal monthly pageviews in a situation where the student receives national attention. If I archive the articles (many 4-5 years old), then I'm assuming we'll lose the rankings for that former student's name, and therefore lose these bursts of traffic we've seen in the past. Thoughts?

    Local Website Optimization | | YourMark.com
    0

  • And since I can now stop being cryptic, I was referring to the new Keyword Explorer.

    Technical SEO Issues | | MattRoney
    0

  • Thanks, yeah, I'm pretty sure national SEO just means working on organic, but it never hurts to ask and be 100% sure. I've already asked one way a while back, but it really boils down to this: since we don't have physical locations, and those drop-shipping places are just affiliate services we can access, we don't warrant the same local search options as, say, a Burger King or Walmart with national physical listings. Thanks anyway

    Local Listings | | Deacyde
    0

  • In my opinion: no, you should not. The result will be that those URLs are indexed, but with a standard description to the effect that Google knows the URL exists but not what is behind it; titles may be taken from links or past data. Canonical is the way to go. Regards, Nico

    Intermediate & Advanced SEO | | netzkern_AG
    0

  • Typically no; LinkedIn decides whether or not the link is nofollow. But if you share the article and you put the URL in your description, it will be a "dofollow" link.
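    For anyone unfamiliar with the distinction, a nofollowed link is simply one carrying the rel attribute below (generic example markup, not LinkedIn's exact output):

    ```html
    <!-- Link equity is not passed: -->
    <a href="https://example.com/article" rel="nofollow">My article</a>

    <!-- Ordinary "dofollow" link (no rel="nofollow"): -->
    <a href="https://example.com/article">My article</a>
    ```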

    Link Building | | GlobeRunner
    0

  • Hi, My suggestion is entirely different from what's mentioned above. Create an ad group with BMM keywords of at least 2 and at most 3 words and let it run for 2-3 weeks. If you do this, you will get new long-tail keywords that you will not get in any keyword tool. I have tried and tested this method several times in my campaigns, and I got several new long-tail keywords along with improvements in goal conversions. Thanks

    Keyword Research | | Alick300
    0