Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Hi there! That's awesome that you are already ranking in positions 1-3, but some solid research from STAT shows that it doesn't guarantee you will trigger a rich snippet. You do have a high chance of getting one, though! I cover in detail how you can find and trigger rich snippets in this blog post: https://www.distilled.net/resources/3-ways-to-find-answer-box-opportunities/ In short, here are the elements that will help you trigger an answer box:
    - Presence of the search query (a question, or an implied question) in the <title> tag
    - Presence of the search query in the <h1>
    - Presence of <ol> or <ul> lists, or a <table>
    - Look at what your competitors have done to get an answer box. How did they format their content (pay careful attention to what Google highlights in bold in the answer box)?
    Hope this helps and let me know if you would like any clarification! - Maryna
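    Putting those elements together, a page targeting an answer box might be structured along these lines (a generic sketch; the question text and list items are placeholders, not a guaranteed recipe):

    ```html
    <!-- Question (or implied question) in the <title> and <h1>,
         with the answer formatted as an ordered list. -->
    <title>How Do I Trigger an Answer Box?</title>

    <h1>How Do I Trigger an Answer Box?</h1>
    <ol>
      <li>Put the question in the title tag and the h1.</li>
      <li>Format the answer as an ol/ul list or a table.</li>
      <li>Study what Google bolds in competitors' answer boxes.</li>
    </ol>
    ```

    The blog post linked above walks through finding these opportunities in more detail.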

    Intermediate & Advanced SEO | | Maryna_Samokhina
    1

  • As for the URL transfer, Dave Nash's answer is perfect: you will pass the authority of URLA.com when you 301 redirect it to URLB.com. You also need to consider the following process for transferring the authority of the existing pages of URLA.com to the new domain URLB.com:
    1. Map each URL of the existing website (at least the important ones) with a 301 redirect to its equivalent on the new domain (example: the About-us page maps to the About-us page on the new domain). This will transfer most of the SEO value and authority to the right pages on the new domain.
    2. Register and verify your old domain and new domain with Google Webmaster Tools.
    3. Create a custom 404 page for the old domain which suggests visiting the new domain.
    4. In a development environment, test the redirects from the old domain to the new domain. Ideally, this will be a 1:1 redirect (http://www.example-old-site.com/... to http://www.example-new-site.com/...).
    5. 301 redirect your old domain to your new domain.
    6. Submit your old sitemap to Google and Bing. The submission pages are within Google Webmaster Tools and Bing Webmaster Center. (This step will make the engines crawl your old URLs, see that they are 301 redirects, and change their index accordingly.)
    7. Fill out the Change of Address form in Google Webmaster Tools.
    8. Create a new sitemap and submit it to the engines. (This will tell them about any new URLs that were not present on the old domain.)
    9. Wait until Google Webmaster Tools updates, and fix any errors it indicates in the Diagnostics section.
    10. Monitor search engine results to make sure the new domain is being properly indexed.
    I hope this helps; let me know via your response if you have further questions. Regards, Vijay
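    If the old site runs on Apache, the 1:1 redirect described above can be sketched in an .htaccess file on the old domain. The domain names below are the placeholder examples from the steps; swap in your real domains, and note that nginx or host-panel redirects work equally well:

    ```apache
    # .htaccess on the old domain: redirect every path 1:1 to the
    # same path on the new domain. 301 = permanent, which is what
    # passes the link authority.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?example-old-site\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example-new-site.com/$1 [R=301,L]
    ```

    Testing this in a development environment first (step 4) lets you confirm each important old URL lands on its mapped equivalent rather than the new homepage.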

    Search Engine Trends | | Vijay-Gaur
    0

  • I'm afraid that I don't know of a way to change these other than finding them all and manually amending them. We do let you know the URLs of the referring pages so you know where to look; this is easier to see if you download the CSV file for these 404 errors from your crawl. I think the Broken Link Checker WordPress plugin can also offer you a list of these broken links so that you can click through directly to amend them, but I don't have any personal experience with it. Sorry I can't be more help!

    Technical Support | | LisaHunt
    0

  • Hi Ben, Mike Roberts answered it very well and explained the reasons you should not buy comments but should instead wait for organic users to comment on your posts/pages. Let me add some more pointers from my own perspective. A very well-written comment can add a lot of value to a page/post: if it complements the content and intent of the page, it is highly beneficial for a new visitor to get another user's perspective or feedback about the website. Now, coming back to the original question of whether buying comments can do that job, the answer most of the time is a very hard no. The reason is simple: the commenters are not actual users of your website or your services, and they might not even understand why your website exists, so their generic comments can look artificial to other users. As for SEO, I don't think Google analyzes whether the comment was made by a person who spent 30 seconds or 30 minutes on the website; for Google Search and SEO, the content of the comment and its relevancy to the content on the page is more important. This purpose also gets defeated by a generic comment; it would look more like spam to search engines than a good contribution from a natural site user. I hope this helps; if you have further questions, please feel free to respond. Regards, Vijay

    Content & Blogging | | Vijay-Gaur
    0

  • Hey Beau, Thanks for your reply. We never received anything from Google in Google Webmaster Tools. One day the site was gone from the SERPs with no explanation. That was on 11 August. Since then we have done a lot of things in Google Webmaster Tools to see if that would help. There were no major issues, but we have done everything we can, and now there is nothing left to do. Yesterday, the site was back on track in Google. I can't say why, but positive thinking (the ability to look for solutions rather than for somebody to blame) is, I believe, among the most important reasons why we are back. We had not deliberately used anything to mark up our site with structured data before. The only thing we could find that might have helped us with structured data was a plugin, Author hReview, which I find very good. However, my intention now is to mark up the entire site with structured data. I read the recommended article about structured data (thanks, it was really interesting). I'm no expert in it, nor is my web developer. Hence I have been thinking of contacting a Schema.org expert. What do you think, would that be a good idea? The more I read about Schema.org, the more it triggers my interest and the more I realize that it could be good to get some help. In Google Webmaster Tools we still have 4 errors regarding structured data; however, when we test it in their own tool, everything works. In the control panel we have added the http:, https:, www http, and www https versions. For some reason Google shows different results for indexed pages, structured data, and sitemaps depending on which of those versions you're looking at. The structured data is now back and we have managed to increase it. Thanks a lot for your help, Beau. Your questions and answers helped us to start looking more at structured data. It was a great relief yesterday when we were back in the search results again. I hope it will stay that way.
Do you think it's worth spending time and resources to contact a Schema.org consultant who can help us mark up the entire page with structured data? Have a nice day! Anders
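    Before hiring a consultant, a minimal Schema.org mark-up can be added yourself as JSON-LD in a page's <head>. The organization name and URLs below are placeholders, not values from the site discussed here:

    ```html
    <!-- Minimal Schema.org Organization mark-up as JSON-LD
         (placeholder values; adapt type and fields to the site). -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Example Company",
      "url": "https://www.example.com/",
      "logo": "https://www.example.com/logo.png"
    }
    </script>
    ```

    Google's structured data testing tool (the "their own tool" mentioned above) can validate a snippet like this before it goes live, which helps chase down errors like the 4 reported in Webmaster Tools.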

    Intermediate & Advanced SEO | | Enigma123
    0

  • Thanks Matt for the detailed answer. I appreciate it.

    Technical SEO Issues | | PeterDavies
    0

  • Mark, thanks so much for weighing in on this. The article you provided a link for above is actually the article I read which led me to port our main number, after which all the craziness happened. If we could have found a way to avoid having all minutes double-billed, it would have been a great solution. In any case, my gut told me that I shouldn't worry too much about a few paid listings, especially since Google uses call tracking in AdWords. I just couldn't find any references to back it up except the legal directory sales guys, and I wanted some independent advice. I truly appreciate you taking the time to respond. Thank you!

    Local Listings | | SEO4leagalPA
    0

  • Honestly, maybe both? For me, it would depend on how useful those old posts are. Are they getting traffic? Are there valuable links pointing to them? If so, it may be worth it. If not, 301 redirect them to new, relevant, optimized posts. For what it's worth, I also wouldn't recommend "churning out" blog posts unless they add value for your visitors.

    On-Page / Site Optimization | | MattRoney
    0

  • I haven't heard of this service, but if you can figure out where any of the indexes pick up links, then that's what they're doing. For Google specifically, they're likely going to post your links on G+. Links posted via G+ see a quicker rate of indexation, and that's something a 3rd party could do without having access to your backend, changing code, or fixing your website.

    Link Building | | EricaMcGillivray
    0

  • Hi Jason! I agree 100% with what Miriam and Ben have said. I would go through all your listings and take out the geo-modifiers if they are not actually part of the legal company name. This could be causing confusion for Google and negatively impacting your rankings, which is not something you want! That said, if Google has not yet caught on to this and you change these, it could still affect how you rank for local search results: by having the city in the company name, you increase your chances of showing up in the local pack. This is a situation where you will need to track, test, tweak, and repeat until you find the right process for your particular situation! Hope this helps.

    Local Listings | | BlueCorona
    0

  • Hi there - Kristina from Moz's Help Team here. Thanks for reaching out to us, and I'm sorry to hear you are experiencing issues with MozBar. Since this is an extension of your browser, it may be difficult for us to isolate, as the toolbar will rely on your personal settings in the browser. Here are some things you can try to get it working:
    - Make sure you are not in incognito mode
    - Disable/uninstall all other extensions, leaving just MozBar, to isolate the problem
    - Re-install Chrome
    - Check to see if the address bar is hiding the extension icons: http://www.screencast.com/t/7cLHKkOP
    - Disable any software firewall or security apps
    As always, you can reach out to our team in the future with any product-related questions by sending an email to help@moz.com or clicking on the little blue chat icon on the lower half of your screen while in the tool. I hope this helps, but please let me know if you're still needing assistance. Thank you, -Kristina

    Link Explorer | | KristinaKeyser
    1

  • Adding a meta noindex tag can mean it takes a few weeks for a page to fall out of the index. These pages probably aren't doing you much harm, so if you wanted to just wait for them to fall out, that's probably fine (although I would update the tag content to "noindex, follow" to help Google crawl to the other noindexed pages). If you really want them out of the index faster, you could use the "Remove URLs" function under Google Index in Google Search Console, which will temporarily remove them from the index while Google is registering the noindex tags, or you can use the Fetch + Render tool and then Submit URLs in Google Search Console, which will cause Google to come back and crawl your pages and find the noindex tag.
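    For reference, the updated tag suggested above would look like this in the <head> of each page being de-indexed (a sketch; which pages get it is up to your setup):

    ```html
    <!-- "noindex, follow" keeps the page out of the index while still
         letting Googlebot follow its links, so it can reach and
         register the other noindexed pages faster. -->
    <meta name="robots" content="noindex, follow">
    ```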

    Intermediate & Advanced SEO | | RuthBurrReedy
    0

  • Agreed with Chris. When you have a lot of pages, and when your code is a little bit more complex than some basic stuff, Google Search Console has a habit of reporting errors like these. What I have also seen in the past is that they pick up parts of your tracking code and try to find URL structures within the code that don't really exist but look like part of it. Nothing to really worry about: if you make sure you run a monthly or quarterly crawl to check for weird URL structures on your site, and these URLs don't pop up there, you should be fine. As mentioned, just mark them as fixed so the real issues will move up again.

    Intermediate & Advanced SEO | | Martijn_Scheijbeler
    0

  • It's great that you got the solution. Could you please close this question now? It's still showing as unanswered. Thanks, Vijay

    Intermediate & Advanced SEO | | Vijay-Gaur
    0

  • It's early for me and I haven't had enough coffee yet, so I may be misreading things. Why do you have three sites on the same IP address with three separate mirrored www versions? As far as I can see, there's no reason you can't use rel="canonical" to fix that issue. If the code is going to be the same because of how they are mirrored, then you'll have one site canonicalized to the right version and the right one canonicalized to itself, which shouldn't cause any issues. I can't guarantee that it will work in every instance, though. Canonicals are a suggestion, not a directive, so the bots will try to respect your tag, but if they feel it is incorrect, improper, deceptive, etc., they can always choose not to.
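    A sketch of the tag as it would appear on the mirrored site, pointing at the preferred version (example.com is a placeholder domain, not one of the sites discussed):

    ```html
    <!-- On every page of the mirror, the canonical points to the same
         page on the preferred domain; on the preferred site, each page
         points to itself. -->
    <link rel="canonical" href="https://www.example.com/some-page/">
    ```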

    White Hat / Black Hat SEO | | MikeRoberts
    0

  • That part is probably configured by the Yoast plugin as it looks pretty much the same as the set-up that we have.

    Intermediate & Advanced SEO | | Martijn_Scheijbeler
    0

  • The other question is how will the "link bait" page be found by potential linkers? It won't show up in search anymore, if you point a canonical to the other page. [Maybe it is just very easy to find from another popular page on your site, which is easily findable?]

    On-Page / Site Optimization | | Linda-Vassily
    0