Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Hi Daniel! Thanks for your question. It's hard to know exactly what's going on without seeing your site, so feel free to PM it to me. There's definitely a chance that this is the case, but if it's happening with Yoast, it's likely a configuration issue on your site rather than a problem with Yoast itself. You may need to adjust your tag permalinks within your WordPress admin so that the URLs in your sitemaps are correct. John

    Technical SEO Issues | | dohertyjf
    0

  • Hi Marie, Thanks for such a detailed answer. I recently disavowed a domain that points 74 links at our website, so that domain must be crawled after my disavow before there's any impact, right? I also want to share our experience with backlinks. Our global website has visitors around the world, and our rankings in India and the US moved in opposite directions as we reclaimed and removed links from a few domains. In more detail: we dropped in India and improved in the US when we reclaimed some backlinks, then dropped in the US and improved in India when we removed those redirects. So it's clear that even one or a few links can trigger the algorithm, since our rankings fluctuated by more than 15 positions; a handful of suspicious backlinks might be pushing us down in the same way. In that scenario, we can use the disavow tool with confidence if we can identify those links. We never received a manual action, and I'm not sure about any manual or silent penalty, but we definitely dropped after the recent Penguin update, and some of those backlinks are the real culprits. Hope I've shared some useful information. Thanks, Satish

    Intermediate & Advanced SEO | | vtmoz
    0

  • I would argue that Amazon.com is probably one of the best sites on the web for SEO and they use Meta Keywords on every page. I have used them on our sites and seen no negative impacts either.

    Technical SEO Issues | | roundbrix
    0

  • Hi, I'm using a data-driven attribution model with a 12-channel grouping. I want to know: if I increase Google Display by 1.4 and I see the weighted conversions change from 2 to 4, for example, what does that mean? How many more conversions will I get?

    Conversion Rate Optimization | | Tormar
    0

  • I've used BigCommerce, Shopify and WooCommerce across a few clients over the years, and in my opinion Shopify is the way to go unless you need lots of customizations. BigCommerce's template system is not as intuitive as Shopify's (Liquid), and the third-party apps are of lower quality as well. There is a ton of content out there on Shopify and lots of third-party support (as there is for WooCommerce). Speed-wise, I think Shopify is a good platform. My ecommerce client is getting 70+ scores on PageSpeed Insights, which we could probably improve further by removing some product images from the homepage if desired.

    Online Marketing Tools | | conradoconnell
    0

  • Hi there, I hate to say this, because it's not a great answer, but it really does depend on a bunch of things. Generating links naturally, i.e. links that come from someone linking to you without being asked, is generally quite difficult unless you have a very, very good piece of content. But when it does happen, I've seen content pieces get anything from 10 to 10,000 links, with the majority of them being natural. On the other hand, I've worked on content pieces where I've had to work really hard to get 10 links using outreach. Sometimes you work really hard and get nothing! If you're just starting out on this path, you're unlikely to get links naturally from day one and will need to work hard to get links via outreach. But you should find that it becomes easier over time, and eventually you get to a point where you're not having to scrap for every single link. It isn't easy, but it's certainly possible. I hope that helps a bit! Paddy

    Intermediate & Advanced SEO | | Paddy_Moogan
    1

  • If we're talking about content in a table as a means of representing data, I believe Google won't care too much about how your content is presented; what will matter is how your users react to the layout. As a rule of thumb, always go with what's best for the user. Ask yourself what is visually more appealing and makes it easier to get your point across.

    Intermediate & Advanced SEO | | thmsmrrtt
    0

  • Hi Sam Wilhoit, Sorry to hear you have lost rankings to a competitor. It's difficult to say with absolute certainty why they would outrank you all of a sudden without knowing who the competitor is and which keywords are affected. Feel free to message me and I can take a quick look for you. In the meantime, did you change anything on your website recently, or even in the past three months? Sometimes a change, no matter how small, can have an adverse impact on your website and rankings. Has your competitor changed anything on their site in the last one to three months, as far as you can tell? Have you been tracking them with any tool that would help you determine this?

    On-Page / Site Optimization | | SurgeStream
    0

  • TL;DR: You're right to be skeptical that this is an urgent issue (in my opinion), but it is something worth fixing at some point, for several reasons. I was far more concerned by search results, but I see you've added those to noindex/disallow in robots.txt, which is great. Not many people know that works! I think it's very possible that Google understands the difference between a classified ad and an editorial content piece; they definitely treat products and content differently. That said, it's generally a good idea to avoid relying on Google's intelligence, as many have been let down by Google's failure to understand. Duplicate content is generally something SEOs are overly concerned with. More often than not it triggers a filter, not a "penalty," and I don't see it as the most dangerous thing you could be doing by any stretch of the imagination. Still, I've seen several classified sites do the following, which I'd recommend as a "best practice" approach. At one time Craigslist did this, and may still be doing it:
    1. Accept non-spam ads with a pending status.
    2. Check against listings in a given period of time for duplicates. This happens even if the ad is changed slightly, so there's some kind of semantic + image analysis going on.
    3. If a duplicate is found under the same user name, inform the user that they've already posted the ad. From here the rules are up to you. Many sites say the ad can't be posted again for 7 days (if the old ad is deleted) or 30 days (if not). They then encourage users to buy a featured listing that shows up higher than others.
    4. If duplicates are found under different user names, give a warning that it's against your terms of service (make sure it is) to post duplicate ads from multiple accounts, that accounts can be banned, and have them certify the post is not the same.
    You don't need to follow this exactly; it's here to give you some ideas on having your users prevent duplicate content for you. Given the generally positive architecture I've seen, it looks like you know what to do with the site better than I would. I don't think 250 out of 10k is bad; having consulted with a few local classified sites, that's actually quite low. But I do think there's something to be gained by detecting duplicates to prevent users from gaining an unfair advantage over those playing by the rules. And if you sell featured listings, this is an excellent way to help those who are most desperate to sell while increasing revenue. I hope that helps. Obligatory disclaimer: this is merely free advice for your consideration, and not the Moz official stance. The consequences of any changes you do or don't make are ultimately your responsibility.

    Intermediate & Advanced SEO | | Carson-Ward
    1
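The duplicate-check step described above can be sketched in a few lines. This is a minimal illustration, not anything Craigslist or Moz actually runs: it only normalizes and hashes the ad text (real systems reportedly add semantic and image analysis), and the function and field names (`fingerprint`, `is_duplicate`, `user_id`) are made up for the example.

```python
import hashlib
import re
from datetime import datetime, timedelta

def normalize(text):
    """Lowercase, strip punctuation, and collapse whitespace so
    trivially edited copies of an ad reduce to the same string."""
    text = re.sub(r"[^a-z0-9 ]+", " ", text.lower())
    return re.sub(r"\s+", " ", text).strip()

def fingerprint(text):
    """Stable hash of the normalized ad text."""
    return hashlib.sha256(normalize(text).encode()).hexdigest()

def is_duplicate(new_ad, recent_ads, window_days=30):
    """Check a pending ad against recent listings.

    new_ad: dict with "text" and "user_id".
    recent_ads: list of (fingerprint, user_id, posted_at) tuples.
    Returns (is_dup, same_user) so the caller can choose between
    the "already posted" message and the terms-of-service warning.
    """
    cutoff = datetime.utcnow() - timedelta(days=window_days)
    fp = fingerprint(new_ad["text"])
    for old_fp, user_id, posted_at in recent_ads:
        if posted_at >= cutoff and old_fp == fp:
            return True, user_id == new_ad["user_id"]
    return False, False
```

The two-value return maps onto the branching in the steps above: a same-user duplicate gets the repost-window message, a cross-user duplicate gets the ToS warning.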

  • Hi Allie McFadyen, I'm not familiar with Express.js and Angular specifically, but since they are JavaScript-based, it's worth noting that Google has historically had issues with JS. Google now says they can crawl JS, but issues can still arise that hinder your site's SEO if it isn't done properly. It seems the test you ran showed the content is not crawlable and viewable, but have you tried creating your own test page and then using the "Fetch as Google" tool within Google Search Console? Even if the test comes back crawlable, I would always be cautious about using a technology with questionable SEO results. To this day I am still cautious about using JavaScript in certain sections of a website, even though Google says they can crawl it. Hope this helps. Let me know if you have any questions. Regards, Kevin

    Technical SEO Issues | | SurgeStream
    2
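A quick first approximation of the test Kevin describes, before reaching for Fetch as Google: fetch the page's raw HTML without executing any JavaScript and check whether a phrase from the rendered page appears in the source. A minimal sketch; the split into two helpers is my own, and any URL and phrase you pass are placeholders.

```python
from urllib.request import Request, urlopen

def fetch_raw_html(url):
    """Fetch a page's HTML without executing any JavaScript,
    roughly mimicking what a non-rendering crawler receives."""
    req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
    return urlopen(req, timeout=10).read().decode("utf-8", errors="replace")

def visible_without_js(html, phrase):
    """True if the phrase appears in the raw HTML source.
    False suggests the content is injected client-side by JS and
    may be invisible to crawlers that don't render JavaScript."""
    return phrase.lower() in html.lower()
```

A failing check here isn't conclusive on its own (Google does render JS in a second wave), but it's a cheap signal that your important content depends on client-side rendering.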

  • Hello there! Sam from Moz's Help Team here! Here is how you can turn off your Moz Local auto-renewal in three steps:
    1. Head to your Moz Local dashboard and click the "Subscriptions" tab in the side nav bar: https://moz.com/local/bulk/managed/subscriptions
    2. Click on the subscription level your listing is under, and check all tabs for additional listings: https://screencast.com/t/TryyRQbWFtwe
    3. In the Options dropdown to the right of your listing, hit "Disable Auto-renewal": https://screencast.com/t/n7VNCEfb
    If you decide you'd like your listing to renew again, all you'll need to do is choose "Enable Auto-renewal" from your Options dropdown. Here is a quick video of this process. You can also read more on our Help Hub. I hope this helps, but please let me know if there's more I can assist with!

    Moz Local | | samantha.chapman
    1

  • While I agree with EGOL, I'd also do two more things. First, check whether the same issue happens when going to the organization's site from home or somewhere other than work; it could be caused by your office's security settings. Second, regarding whether Google will penalize you: no. If it's a paid listing (e.g. you're getting the listing only because you paid), then it's a bit more of a grey area, but it's also a legitimate association, so I honestly wouldn't worry about it. If you are worried, you could always ask for the link to be nofollowed. But as I said, I think this is a case where a followed link is totally fine and no search quality reviewer would bat an eye at it.

    Technical SEO Issues | | dohertyjf
    1

  • How much does he plan to spend on marketing to reinforce his brand? I believe .io will rank on Google just as easily as .net, .info, .us, etc., but how much will it take to rank in his customers' brains? I have a .us, and 7.5 years later I still have to reinforce the domain when I am talking to a client. It's like trying to get a client to say "chartreuse" instead of "lime green."

    White Hat / Black Hat SEO | | julie-getonthemap
    2

  • **No client wants to hear it, but it's the truth.** I agree 100%. And they need to be honest with themselves about the quality of their content. They can't sit down, type 30 minutes of yada yada yada, and think that they have something that will attract links, shares, etc. They need to look at the Moz blog to see what type of content real marketers produce.

    Local Strategy | | EGOL
    1