Latest Questions
Have an SEO question? Search our Q&A forum for an answer; if you don't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!
-
Advanced SEO - Locations vs Service Areas
Hello Larry,

Thank you for bringing your question to the forum. I'd like to address this in two parts.

First, I'm honestly concerned that a Google support rep has given you bad advice. This does happen. P.O. boxes are not considered valid addresses and are expressly cited as ineligible by Google's guidelines, which read: "Use a precise, accurate address and/or service area to describe your business location. P.O. boxes or mailboxes located at remote locations are not acceptable." So, unless I'm missing some aspect of your business model, what the Google rep told you is incorrect. As I've said, this sort of thing does happen, and I'm concerned that your listings could be removed at any time for a guideline violation; this could even prejudice Google against your account in general, including its legitimate listing in Lincoln.

I think this is the biggest problem you face right now, and in your shoes, if you wanted a second opinion, I would do the following: go to Google's forum and title your thread "Gold Product Experts, Did a Google Rep Steer Me Wrong?" Then, in the post, share the information about your business listings and repeat the advice that was given to you. Gold Product Experts are Google's top-level forum volunteers and the best source of an educated second opinion when something Google says seems off. I strongly suspect they will point to the guidelines I've cited and that you will end up removing these listings as ineligible.

You're not alone in facing confusion about this, and I believe a read-through of my recent Moz blog post on wanting to rank beyond your physical location could help you level up your strategy for this common challenge. See: https://moz.com/blog/rank-beyond-location

Your second question, regarding hiding/showing your address, is at the center of a discussion that's been going on for many years in the local search marketing industry.
On the surface of things, hiding your address is not supposed to negatively impact your ability to get your listing ranking well. However, I can personally say that I have seen enough evidence to the contrary to convince me that hiding your address can, indeed, negatively impact your rankings; some local SEOs strongly believe that doing so can even doom a listing. So, know that your question is valid, but that the topic is controversial, with experts frequently stating that hidden addresses do hurt rankings. I hope this is helpful. If you have any follow-up questions, I hope you'll ask!
Local Listings | | MiriamEllis1 -
Prevent Rogerbot from crawling pagination
Robots.txt rules:

If you have architecture like site.com/blog/post/page/1, then use:

User-agent: rogerbot
Disallow: /page/

If you have architecture like site.com/blog/post?p=1, then use:

User-agent: rogerbot
Disallow: /*?p=

If you have architecture like site.com/blog/post?page=1, then use:

User-agent: rogerbot
Disallow: /*?page=

That should pretty much stop Rogerbot from crawling paginated content. It would certainly stop Googlebot, though I don't know for certain whether Rogerbot respects the "*" wildcard the way Googlebot does. Give it a try and see what happens. Don't worry: in a robots.txt file, only "*" is treated as a wildcard, so the "?" won't cause any problems and doesn't need an escape character.
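If you want to sanity-check which URLs a rule set like this would block, here's a minimal Python sketch of Googlebot-style pattern matching. This is my own toy matcher, not an official parser (note that Python's built-in urllib.robotparser does not support the "*" wildcard), and the paths are illustrative:

```python
import re

# Sketch of Googlebot-style Disallow matching: '*' matches any run of
# characters; everything else is a literal, anchored at the START of
# the URL path. The three rules mirror the patterns suggested above.
RULES = ["/page/", "/*?p=", "/*?page="]

def blocked(path: str, rules=RULES) -> bool:
    for rule in rules:
        # Convert the rule into a regex anchored at the start of the path.
        regex = "".join(".*" if c == "*" else re.escape(c) for c in rule)
        if re.match(regex, path):
            return True
    return False

print(blocked("/blog/post?p=1"))      # True  (matches /*?p=)
print(blocked("/blog/post?page=3"))   # True  (matches /*?page=)
print(blocked("/blog/post"))          # False (no rule matches)
# Caveat: under strict start-of-path matching, "/page/" only blocks
# paths that BEGIN with /page/, so a URL like /blog/post/page/1 would
# need a wildcard variant such as "/*/page/" to be caught.
print(blocked("/blog/post/page/1"))   # False
```

This also shows why testing matters: the same Disallow value can behave differently than you expect depending on where the pattern sits in the URL path.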
Getting Started | | effectdigital0 -
Will a page be marked as 404 if you replace country-specific letters in the URL?
Not a problem. Just remember that web users' browsers, Google's crawlers and SEO-tool crawlers all tend to react to certain factors a little differently. Never lose sight of pleasing Google first, and never lose sight of the fact that Google wants to please users most of all (without them, all its ad revenue disappears).
On-Page / Site Optimization | | effectdigital0 -
Multiple Locations Same City
Hi Waquid! Thank you so much for adding a bit more context to your question; now I can see your scenario perfectly.

So, if a business does something like landscape design and has only two offices in San Francisco, it's likely that the homepage and both location landing pages will include references to "San Francisco" and "Landscape Design". The landing pages could also be optimized for hyperlocal terms like "North Beach" or "Glen Park" if they are in different neighborhoods. However, if the business has, let's say, 20 offices in California, then it wouldn't be likely to use any city or neighborhood terms on the homepage because there are simply too many cities to cover. Rather, the homepage might reference regional names like the SF Bay Area, Central Valley or Orange County, or even just Northern California/Southern California.

If the service is identical at all locations, then there's really no avoiding using those service keywords on all pages. You can vary them in any way that keyword research shows you variants. For example, you could divide up findings like "sustainable landscape design, native landscape design, commercial landscape design" between the pages, but if you have just one overarching service, then it will be reflected on all pages. It's the geo-terms that need to be parsed up to fit the scenario of your various offices.
Local Website Optimization | | MiriamEllis0 -
301 Old domain with HTTPS to new domain with HTTPS
So I figured out another way: I did a 301 at the registrar level directly to the https URL. This ensures every version of that domain (http, https, www or non-www) goes to the same https URL without 2 or 3 chained redirects.
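For anyone who can't redirect at the registrar level, an equivalent single-hop rule on the old domain's server might look like this (a hypothetical Apache .htaccess sketch; the domain names are placeholders):

```apache
# Hypothetical sketch: send every variant of the old domain
# (http/https, www/non-www) straight to the new https domain
# in a single 301 hop. Domains are placeholders.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-domain.com/$1 [R=301,L]
```

Because the condition matches every hostname variant of the old domain, each request resolves to the final https URL in one redirect rather than chaining through http-to-https and www-to-non-www hops.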
Intermediate & Advanced SEO | | waqid0 -
Competitor Inverse Relationship
It wouldn't be possible to draw any conclusions from such top-line data. From this post, we don't know what the keywords are or which websites were involved in these movements. We'd want to look at actual keywords, check the Wayback Machine to see how content on both sites changed, and look in Ahrefs for any matching link trends for either site. The Alexa score is ancient; to be honest, I wouldn't look at it any more. Regardless, it's not possible to check for black-hat attacks on "purple line" or "blue line"; we need domains here! If you want a comprehensive audit of exactly what happened, no one can supply it for 'mystery' websites based on a couple of charts. You're also looking at things in a very binary way. How do you know they didn't do something good at the same time you screwed something up? Why does it have to be one or the other? In SEO, there is usually a convergence of factors behind such large movements!
White Hat / Black Hat SEO | | effectdigital1 -
Vertical pipe in HTML
What are you trying to achieve? What are you using the pipe symbol for?
Technical SEO Issues | | jasongmcmahon1 -
Fake DMCA :(
Please give more details about your site and the allegations against you.
Technical SEO Issues | | effectdigital1 -
Link explorer section showing different linking domains numbers
Hi there, Sam from Moz's Help Team here!

So, it's not necessarily that one tool is more accurate than the other, but rather that one tool may have more data at the moment. In regards to Link Explorer (and the data shown in MozBar), newly discovered links can be populated into our index in about 1-3 days; however, there are a lot of factors which can affect our ability to find and index links to your site. It's important to note that we add new data to our index every day, but it may take some time for us to discover backlinks to your site based on factors like the crawlability of the referring pages, the quality of the links and the referring pages, and more.

If you are not seeing links or linking domains that you know you have, you may want to make sure that they can be indexed. It is also a good idea to check whether we've indexed the page on which a given link is found; if we haven't indexed the referring page yet, you won't see your link in our index.

You can also add links to Link Tracking Lists. Once you add a link to your tracking lists, we will add that page to be crawled. As long as it is accessible to our crawler, you should see the link in our index as soon as we can index those pages.

I hope this helps — let me know if you have any further questions!
Link Explorer | | samantha.chapman0 -
Moz spam score 16 for some pages - Never a manual penalty: Disavow needed?
If they contain anchor text or are obviously trying to game the system by boosting rankings, you definitely want to remove them, but you need a good eye for these things. Just because the Spam Score is high doesn't always mean a link is spammy; look into why Moz thinks it's spammy. Give it the smell and look test: if it looks fishy, then it's fishy. If you only have your citation listed with a www link, you don't have much to worry about.
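If you do decide a disavow is warranted, the file Google expects is plain text with one entry per line (a sketch only; these domains and URLs are placeholders, not real examples from your profile):

```text
# Hypothetical disavow.txt sketch; domains/URLs are placeholders.
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Or disavow links from a single page:
https://low-quality-site.example/page-with-spammy-link
```

Use domain: entries when a whole site is junk, and single URLs when only one page is the problem; lines beginning with # are comments.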
White Hat / Black Hat SEO | | waqid0 -
Link Structure June 2019
We agree with Gaston. Our experience is that it is almost never worth the time to change URL structures on an established site. Instead, focus time and effort on creating more and better content, adding more internal links, and doing legit link building. For example, we looked at this blog post, https://www.fishingtackleshop.com.au/blog/chasebaits-lure-range/, where you highlight four separate lures but link to only one of the product pages instead of all four. Furthermore, you linked only once, using a "Shop Now" button. We'd recommend adding another link to that product page: link the text of the product's name somewhere above the "Shop Now" button, within the content talking about that specific lure.
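As a rough sketch of what we mean (the product name and URL below are placeholders, not the real ones from the post):

```html
<!-- Existing button link to the product page: -->
<a class="button" href="/example-chasebaits-lure/">Shop Now</a>

<!-- Added keyword-rich text link within the lure's description,
     above the button: -->
<p>The <a href="/example-chasebaits-lure/">Chasebaits example surface
lure</a> walks effortlessly side to side on a slow retrieve ...</p>
```

The descriptive anchor text gives search engines context about the destination page that a generic "Shop Now" button can't.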
Intermediate & Advanced SEO | | Nozzle1 -
Best less expensive graphic design
Hi Bob, Have you looked into 99designs? I find the graphic designers on that platform are of a higher caliber than those on Fiverr. My favourite feature is that you can start a contest: you brief the community on your concept, designers submit their designs, and you pick your favourite. Hope this helps; good luck. https://99designs.ca
Web Design | | The.Mindfulness.Marketing2 -
Wondering if anybody actually uses paper.li?
Thanks. I'm looking for some examples of effective use, but I haven't been able to come up with anything.
Content & Blogging | | Brando161 -
How to get rid of Google's manual action penalty on spammy schema markup?
Yes, I have submitted the home page to Google for indexing and it is indexed. Also, I have sent a reconsideration request twice after removing all the schema code from the website, but it was rejected both times.
On-Page / Site Optimization | | Ananya_Ramje0 -
Is there a way to forward backlink benefits from one domain to another without a redirect?
In this case, where I'm unable to do any sort of 301, are there any other on-page options that might be a reliable way to forward link equity? The other option is to keep pressing to change the domain of the login page to a subdomain of the marketing site, which is unlikely at this point; but even in that case, wouldn't the subdomain cause issues with link equity?
Intermediate & Advanced SEO | | OCN0 -
Found 28 pages (status code 200) on Moz Pro but 8 on Screaming Frog SEO Spider 11.3
Eight!? That sucks. Try my crawl file: https://d.pr/f/KTsX5i.seospider (a crawl file, hosted on my Droplr). When you load an SF crawl file directly, it remembers all the settings it was run with (handy, huh?). So just make copies of my file and re-run the crawl from there. Problem solved; no need for any questions.
Moz Tools | | effectdigital0