Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Local Website Optimization

Considering local SEO and its impact on your website? Discuss website optimization for local SEO.


  • My pleasure! And you might like to know, I have an article coming out on the Moz Blog on this topic later this month. Stay tuned!

    | MiriamEllis
    1

  • I'd have to see the site to answer that question, but when this happens it generally means that users did not think your site was the best result. As you hit page one, Google gets more data, including impressions, clicks, click-through rate, bounces, query modifications... basically, they'll know if searchers found what they were looking for or "liked" your site. Then again, it could be something else, but I'd have to see the site. Did you make any changes? Are your redirects still working?

    | Everett
    0

  • Thank you, Miriam, for your response. A couple of follow-up questions: What would you recommend I do about the high spam score? Disavow links with a score over a certain %?

    | Jarod4545
    1

  • Hi Matt, Apologies for the long delay; I didn't get a notification, or at least didn't see one, telling me you had replied. Has anything changed? If not, then let's arrange a call, or give me access to your Bing account and I will take a look. Steve

    | MrWhippy
    0

  • So, you basically can't 'force' Google to do anything, but there may be better ways to encourage it to remove these URLs. The only way to force Google to remove a URL is the URL removal tool in Google Search Console, but that only removes a page temporarily and it's a pain to do en masse submissions. As such, not my recommendation.

    One thing to keep in mind: you have loads of pages with noindex directives on them, but Google is also blocked from crawling those pages via robots.txt. So if Google can't crawl the URLs, how can it find the noindex directives you have given? Robots.txt should be used for this, but your chronological deployment is off; it's too early. Lift those rules for now and put them back on at the very, very end, when Google has 'gotten the message' and de-indexed most of the URLs (makes sense, yes?).

    My steps would be:

    1. Noindex all these URLs, either with the HTML meta robots tag or the X-Robots-Tag (HTTP header) deployment (there are multiple meta robots deployments if editing the page code is going to be difficult! Read more here).
    2. Also deploy noarchive in the same way, to stop Google caching the URLs.
    3. Also deploy nosnippet, to remove the snippets from Google's results for these pages, which will make them less valuable to Google in terms of ranking them.
    4. For the URLs that you don't want indexed, make the page or screen obviously render content that says the page is not available right now. This one might be tricky for you, as you can't do it just for Googlebot; that would be considered cloaking under some circumstances.
    5. On the pages which you have noindexed, serve status code 404 to Google only (if it's just a status code, it's not considered cloaking). So for the Googlebot user agent, make the HTTP response a 404 on those URLs (temporarily unavailable, but coming back). Remember to leave the actual, physical contents of the page the same for both Googlebot and users, though.
    6. If that doesn't work, swap out the 404 (sent only to Googlebot) for a 410 (status code: gone, not coming back) to be more aggressive. Note that it will then be harder to get Google to re-index these URLs later. Not impossible, but harder (so don't open with this).
    7. Once most URLs have been de-indexed and de-cached by Google, put the robots.txt rule(s) back on to stop Google crawling these URLs again.
    8. Reverse all changes once you want the pages to rank (correct the page's contents, remove the nosnippet, noarchive and noindex directives, correct the status code, lift the robots.txt rules, etc.).

    Most of this hinges on Google agreeing with and following 'directives'. These aren't hard orders, but the status code alterations in particular should be considered much harder signals. There's a rough sketch of the header and status code deployment below. Hope that helps.
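    A minimal sketch, purely to illustrate how steps 1-6 above could hang together, assuming a small Flask app for the sake of the example; the route path, page text and user-agent check are placeholders, not a drop-in implementation:

      # Illustrative only: serve robots directives via the X-Robots-Tag header
      # and a 404/410 status to Googlebot alone, keeping the page body identical
      # for every visitor (per the advice above).
      from flask import Flask, request, make_response

      app = Flask(__name__)

      # Hypothetical page body shown to all visitors.
      PAGE_HTML = "<html><body><p>This page is not available right now.</p></body></html>"

      @app.route("/retired-page")  # placeholder URL
      def retired_page():
          response = make_response(PAGE_HTML)
          # Same directives as a meta robots tag, without editing page code.
          response.headers["X-Robots-Tag"] = "noindex, noarchive, nosnippet"
          # Only the status code differs for Googlebot; the content does not.
          if "Googlebot" in request.headers.get("User-Agent", ""):
              response.status_code = 404  # swap for 410 later if needed
          return response

      if __name__ == "__main__":
          app.run()

    In real life you'd key this off your actual URL list, and roll the robots.txt change in only at the very end, as described in the steps above.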

    | effectdigital
    1

  • Thank you to everyone for their input. I appreciate your feedback. I'm going to research more about link building and where to start that process. I will advise against the multi-site approach and focus efforts elsewhere.

    | Scott-Jones
    1

  • Hello there! Using a rank checking tool set to the zip code of your business, I see you ranking #1 in the local pack and #9 organically for "real estate photographer Denver". Doing the same search from my location in California, I see you ranking #2 in the local pack and #7 organically. In other words, you've already achieved really excellent visibility.

    I would question the path of making your homepage more optimized towards Denver, because I'm assuming Denver is only 1/50th of your customer base if you offer a 50-state service. Right now, with your optimization exactly as it is, Google is still highly favoring your Denver location in the Denver-based-and-modified results. I would feel some concern that overdoing the homepage with Denver would be subtracting from your visibility in other parts of the country. At the same time, if you created a Denver-only page for the website and pointed your GMB listing to it, your local rankings would almost certainly drop, as the new landing page wouldn't have the same DA as your homepage.

    Your About and Contact pages can act as landing pages, of sorts, for your Denver location, but I still wouldn't advise pointing your local business listings at them. Nor does that seem at all necessary given that you're ranking #1 and #2 locally for this core term. Rather, if what you're really hoping to do is get more business from Denver clients, I'd leave your website as-is and consider some of the following efforts:

    1. Continue to hammer down on reviews. You're already blowing away your local pack competition in this regard, but I'd keep working on this.
    2. Keep up the work on Google Posts. This is for your local ranking maintenance, of course.
    3. For your organic goals, I'd work on links. Explore whatever Denver-based links and linktations you can get. This tutorial might be helpful: https://moz.com/blog/linked-unstructured-citations.
    4. I would also work on social outreach within the Denver community, which actually corresponds with point #2.

    Hope you'll get some more feedback from the community!

    | MiriamEllis
    1

  • Three websites = 3x the work... and budget needed. Similar to a medical clinic when it comes to services: list this company's services and optimize accordingly. If it is an incredibly competitive market, a niche website may be the way to go. If I had it my way, in a medium to low competition market, I would focus on strengthening the content & DA of one site rather than three.

    | WebMarkets
    0

  • Abbi, thanks so much. That is very helpful.

    | digitalmarketingneoscape
    0

  • Of course, anytime. If you would like to give me other examples, or a look at the site, I can give you a lot more information. Glad to be of help! Tom

    | BlueprintMarketing
    1

  • Good afternoon! My ideal approach is that you go with a single website at domain.com, with the location landing pages all being domain.com/chicago. It's the simplest method, with no drawbacks of any kind. Subdomains can get you into this messy controversy: https://www.seroundtable.com/seo-google-fight-subdomains-subdirectories-25126.html

    But your scenario has an extra layer of complexity built into it, in that your client has also built out multiple domains. That makes this such an 'it depends' scenario that it's unlikely you'll be able to get a really authoritative answer from anybody who isn't looking at your client's actual domains. So, what I can say here is that, ideally, I'd recommend a single website, with all content on it and links to it going to strengthen the overall authority of the domain and brand, instead of being divided up across multiple domains. I think this is the strongest approach in terms of SEO, branding, consumer experience, ease of management, etc.

    But I will say that there are some cases in which, if the client is earning significant revenue from multiple domains, it could be not-the-best idea to tell them to take those down or even redirect them. On the other hand, sometimes the client has simply created a spammy mess, sometimes even for the purpose of fooling search engines and customers, in which case you may need to make overhauling all of their web assets a prerequisite of working with them. Definitely, this is an "it depends".

    | MiriamEllis
    0

  • Hello Korado11,

    As I understand your question, you are offering a service in a large city with several hundred districts, and you are hoping to rank for your service in multiple districts. There will be several facets to the advice I can give you about this.

    1. If you are trying to rank in the organic results for these districts, your ability to do so will be based on a combination of domain authority, page authority, links, optimization, website content and user behavior.
    2. If you are trying to rank in the local results for these terms, everything in #1 applies, plus other factors like reviews and citations, but you must also take into account the role user proximity plays in local rankings. If you are physically located in district A and your potential customer is located in district B, Google simply may not show you in the local results for that user, because they are not physically near your business at the time of search. Where competition is low or weak, it is sometimes possible for a business to rank locally in multiple districts of a large city, but if competition is stiff, it will be an uphill battle to overcome Google's extreme bias towards user proximity.
    3. Whether you are hoping to rank organically or locally, your key here will be to do a competitive audit of the businesses outranking you to see how their metrics for each surmised ranking factor outweigh yours. You will tally this up with software or in a spreadsheet (there's a rough sketch of this below), and this is what should set the strategy that you or any marketer you hire will enact, so that your metrics can compete with and eventually surpass those of the businesses currently outranking you. However, there may be some things you simply can't overcome, like user proximity. Where organic/local efforts fail, your alternatives are PPC, social and offline marketing to reach audiences you just can't access via SEO.
    4. I would advise you to closely study the most recent Local Search Ranking Factors survey to see which factors expert local SEOs feel have the most impact on local and localized-organic results. You will want to familiarize yourself with each factor, and most of them should be in your competitive audit process.
    5. I will also link you here to a post I wrote two years ago containing a spreadsheet for a competitive local business audit, but I want to mention that there are some new factors I'd be adding to that in 2019. Nevertheless, it will give you a good idea of what an audit should look like.

    At the end of the above process, what you'd want to have is a list of tasks and goals, including a list of goals that aren't going to be achievable, so that you don't waste time or money on pursuing them with tactics that won't actually work. By studying the SERPs and your competitors, you should begin to have a strong sense of which districts you may be able to achieve a strong local and organic presence in, and which districts may be beyond the reach of SEO, making it more sensible for you to attempt to reach those desired audiences with PPC, social or offline outreach.

    Hope this helps!
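    As a very loose illustration of the tallying step in point 3, here's a sketch that assumes you've already gathered a few metrics per business; the factor names and numbers are invented for the example:

      # Invented example data: your metrics versus two competitors for a
      # handful of surmised ranking factors.
      audit = {
          "your_business": {"domain_authority": 22, "review_count": 14, "linking_domains": 35},
          "competitor_a":  {"domain_authority": 31, "review_count": 52, "linking_domains": 80},
          "competitor_b":  {"domain_authority": 18, "review_count": 9,  "linking_domains": 20},
      }

      you = audit["your_business"]
      for name, metrics in audit.items():
          if name == "your_business":
              continue
          # Positive numbers mark the factors where that competitor leads you.
          gaps = {factor: value - you[factor] for factor, value in metrics.items()}
          print(name, gaps)

    A spreadsheet does the same job; the point is simply to see, factor by factor, where the businesses outranking you are stronger.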

    | MiriamEllis
    1

  • Google is probably not going to take any manual action based upon your complaint. Keyword stuffing and hidden keywords have been around since the beginning of Google and they have algorithms that look for it. These algorithms sometimes find it, sometimes don't find it, sometimes demote it and sometimes don't demote it. I would not spend time trying to report this type of action. I would spend that time instead working on my website.

    | EGOL
    0

  • Good morning! You might find this article of some use (https://medium.com/@khusbu_machar/what-is-google-knowledge-graph-and-how-to-get-rid-of-wrong-data-on-knowledge-graph-c9dce02fd560), but from what I have seen, it can take Google time to sort out old references to a piece of data from more recent ones. Have you tried to retroactively edit old references to your former CEO where they exist on the top pages coming up for your branded search?

    | MiriamEllis
    0

  • Good morning and Happy New Year! Either way, or both, is fine with the internal links.

    | MiriamEllis
    1

  • Hi Greg,

    At first sight, it seems to be an issue with where you are searching for those international sites. Keep in mind that, for over a year now, Google has used your IP to localize your search. So, considering you're searching in the US for your UK site, it's possible that your own location is the issue there.

    Also, I'd take a deep dive into:

    1. That international configuration; it is one of the most complicated settings in SEO.
    2. In GSC, for the main site (.com), how much traffic it is receiving from your other target countries.
    3. Whether every site is properly linked with the others.
    4. Crawling issues: it could be that Googlebot isn't able to crawl all of the sites entirely.

    As always, when it comes to international SEO, these resources come in handy:

    - Hreflang generator - Aleyda Solis
    - International SEO - Moz Learning Center
    - The Guide to International Website Expansion: Hreflang, ccTLDs, & More! - Moz Blog
    - The International SEO Checklist - Moz Blog

    There's a small sketch of the hreflang markup below, too. Best of luck. Hope it helps. GR
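    As a small, purely illustrative sketch of what the hreflang annotations look like once the international configuration is in place (the domains below are placeholders, not your actual sites):

      # Placeholder language/country versions of the same page; every version
      # should carry the full set of alternates, including a self-reference.
      markets = {
          "en-us": "https://www.example.com/",
          "en-gb": "https://www.example.co.uk/",
          "x-default": "https://www.example.com/",
      }

      for lang, url in markets.items():
          print(f'<link rel="alternate" hreflang="{lang}" href="{url}" />')

    Aleyda's generator mentioned above will build these for you; the sketch is just to show the shape of the output.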

    | GastonRiera
    0