Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Local Website Optimization

Considering local SEO and its impact on your website? Discuss website optimization for local SEO.


  • When you say page rank, do you mean Google's PageRank or Moz's Page Authority from Open Site Explorer? If you mean Moz's Page Authority, that measure should go up, though I don't know how long that would take. You might need to wait until there is another crawl.

    | Linda-Vassily
    0

  • A simple Google search for "title length for best seo" would show that your title tag gets truncated after about 55-58 chars...and I noted that your own is 69 chars...that's "old" thinking. Search here at Moz for title length etc. and I remember that there are a couple of great blog posts on just how to craft them (see the sketch below this reply)...

    | JVRudnick
    0
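
    A minimal sketch of a title tag that stays under the roughly 55-58 character truncation point mentioned above; the page and business names are hypothetical:

      <head>
        <!-- 49 characters: short enough to avoid truncation in the SERPs -->
        <title>Local SEO Services in Denver | Acme Web Marketing</title>
      </head>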

  • Hi Everyone, I recently asked this question http://moz.com/community/q/will-a-geo-localization-site-create-thousands-of-duplicates and it is somewhat related to this thread. If anyone could help me out by answering this question, that would be great! Thanks in advance!

    | Ideas-Money-Art
    0

  • I agree with the first two responses. The most important header tag is the H1, and it would be wise to put it as the very first heading tag, with your main keyword in it. The other tags are important, but not as much, and the order of the H2 and H3 isn't that important, at least from what I have experienced (see the sketch below this reply). Sincerely, Garret

    | eWebify
    0
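
    A minimal sketch of the heading order described above, with hypothetical copy; the H1 comes first and carries the main keyword:

      <body>
        <!-- H1 first, containing the main keyword -->
        <h1>Foundation Repair in Indianapolis</h1>
        <h2>Residential Services</h2>
        <h3>Basement Waterproofing</h3>
        <h2>Commercial Services</h2>
      </body>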

  • PS - any idea how to add schema for the regions served? For example, my client's business address is in area A, but their customers and the regions they serve are in areas B, C & D, if you see what I mean, and they really want to target local search (in part by applying region-specific schema) for these regions, not just the company address (see the sketch below). Any advice much appreciated! All best, Dan

    | Dan-Lawrence
    0
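
    One common way to mark up regions served is schema.org's areaServed property on a LocalBusiness, which keeps the postal address as the real office location while listing the other target areas separately. A minimal JSON-LD sketch with hypothetical business and place names:

      <script type="application/ld+json">
      {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": "Acme Plumbing",
        "address": {
          "@type": "PostalAddress",
          "streetAddress": "1 High Street",
          "addressLocality": "Area A"
        },
        "areaServed": [
          { "@type": "City", "name": "Area B" },
          { "@type": "City", "name": "Area C" },
          { "@type": "City", "name": "Area D" }
        ]
      }
      </script>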

  • The site is several years old. I do not think the keywords are highly relevant to the discussion. It could be any word in the world, and I would want a general idea of why it was happening.

    | Atomicx
    0

  • While the location information has the potential to be very helpful in making business decisions, this data is far from perfect. Google still primarily accomplishes location mapping using specific IP addresses and IP ranges. If the information is not available, a location of (not set) will be reported. This simply means that the information was not captured because there was no IP address to work with. Looking at 100,000+ visits in one of my web properties over the past 30 days, we are running between 3-4% location (not set). In addition, there are major issues with accuracy for mobile devices. Google says the following on their official page: "For web visitors, Location is derived from mapping IP addresses to geographic locations. City location may not be accurate for visits from mobile devices." You can read more about it here: https://support.google.com/analytics/answer/1144408?hl=en Hope that helps!

    | davidangotti
    0

  • It doesn't really matter...for now. Obviously, you'll need to pay more and give more information to get the better certificates, so THAT might have an effect in the future. Who knows when. I'm currently moving some of our sites to different certs to test them in the coming year. Getting the green bar, like Yoast's site or PayPal and many others, does give you a more secure feel. For regular browsers, it could mean that the site is safer, and maybe it could lead to better engagement/conversions.

    | DennisSeymour
    0

  • Yes. The layout markup should not mess things up. Just put the whole street address within itemprop="streetAddress", and you should be fine (see the sketch below).

    | justin-brock
    0
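
    A minimal microdata sketch of the advice above, with a hypothetical address; the whole street address sits inside a single itemprop="streetAddress" element, so surrounding layout markup does not interfere:

      <div itemscope itemtype="https://schema.org/PostalAddress">
        <!-- Layout wrappers around these spans are fine -->
        <span itemprop="streetAddress">123 Main Street, Suite 4</span>,
        <span itemprop="addressLocality">Indianapolis</span>,
        <span itemprop="addressRegion">IN</span>
      </div>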

  • This makes sense, and is a good way of framing it. Thanks very much. Your answer here made me see that my two tests (Indianapolis and Rossville) actually showed somewhat different algorithm principles. I understand that with the increase of mobile and thus 'conversational' voice searches, the inclusion of a place name is less and less common. Thus with the 'Rossville' example, since 'Rossville' is ambiguous and was not differentiated from other Rossvilles, I can see how others might creep in. Even so, I would think Google would be programmed to first see that my location is set to Rossville, IN, and thus conclude that Rossville, IN must be the one I'm referring to. If every search was done on mobile, then I can maybe understand seeing Rossville, PA, and Rossville, GA in the SERPs. But even then, not in positions 1 and 2, before Rossville, IN, where I am located... So, when I specified a very unambiguous place name (Indianapolis), while my location is set to that same unambiguous place (Indianapolis, IN), would Google's algos look outside of Indianapolis, like they did with Rossville? It turns out the inverse process is happening here (I think). I went back to look at the results for 'foundation repair indianapolis' and found that the listings were extra-localized, starting with businesses that have an Indianapolis address and moving concentrically outward from there. But again, we rank highly when location is set to Indianapolis, IN, and we simply search 'foundation repair'. Apparently in this case, when a search string does not specify a place name, Google produces items related to {foundation repair} in the general vicinity of {indianapolis}, based on the inferred location data, instead of the other approach, which yields limited results within the city. This is surprising to me (though beneficial to us). I'm probably constructing too detailed a process here based on just a couple of small tests. I'd love any other input. And sorry for the novel!! I'm trying to work all this out. It's an interesting discussion though. I hope it's helpful to someone in the forums.

    | clearlyseo
    0

  • Marcus...that should'a been "...the ever DEPENDABLE Phil Rozek..."

    | JVRudnick
    0

  • Hi, If this proxy website is a clone of the original website and it is not thoroughly blocked from being accessed and found by the search engines, we have a huge problem, and this proxy website should be taken down ASAP, as it might create duplicate-content issues, especially if the original website is fairly new and the proxy website has some strong SEO factors like domain age, domain authority or page authority (see the sketch below). Please do write back in case you would like to give some more details about the issue or have queries in this regard. Best regards, Devanur Rafi

    | Devanur-Rafi
    0
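
    If the proxy's pages can be edited at all, one common way to keep a duplicate copy out of the index is a robots meta tag (or the equivalent X-Robots-Tag HTTP header). A minimal sketch:

      <head>
        <!-- Tells crawlers not to index this duplicate copy or follow its links -->
        <meta name="robots" content="noindex, nofollow">
      </head>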

  • Hi guys, thanks for your responses. You've confirmed what I suspected, especially the implications concerning ccTLDs. There are none. To be honest, it's more about being tidy and organised, managing a domain's web presence in a systematic way, and not getting bogged down in other digital areas. If one gets their website technically organised in a way that complies with the search guidelines - i.e. redirects put where they should be, canonical URLs correct (see the sketch below), broken links repaired, duplicate URLs and pages removed and/or redirected - then I foresee fewer problems for SMEs on the digital journey. Cheers

    | Stewart_SEO
    1
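
    A minimal sketch of the canonical-URL housekeeping mentioned above, using a hypothetical domain; each duplicate variant of a page declares the one preferred URL:

      <head>
        <!-- Served on duplicate variants, e.g. URLs with tracking parameters -->
        <link rel="canonical" href="https://www.example.com/services/">
      </head>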

  • Backlinks are what's hurting you. According to Ahrefs, you've got 52 backlinks coming from places like article directories, Russian comment spam, etc. Get as many bad backlinks removed as you can, and disavow what you can't get removed (a sketch of the disavow file format follows below). But make sure you're building new links - quality links - at the same time. You need new links to improve rankings; getting rid of the bad links isn't enough.

    | Kingof5
    0
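
    A minimal sketch of Google's disavow file format (a plain .txt uploaded via Search Console), with hypothetical domains; "domain:" lines disavow an entire site, bare URLs disavow single pages, and "#" lines are comments:

      # Spammy article directories - removal requests sent, no response
      domain:spammy-articles.example.com
      domain:comment-spam.example.net
      # Disavow a single bad page rather than the whole site
      http://forum.example.org/profile/badlink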

  • Great answer Sha. I will post the outcome when changes occur.

    | Elchanan
    0

  • Hey Ricky, Such a good question. As you can probably guess, the proviso here is that you can build all of these pages ... if they have genuine value. To be honest, most sites you see taking this approach are building pages that don't really have a purpose other than attempting to gain rankings for a laundry list of terms, right? After all, having a slip and fall accident in Miami isn't really much different than having one in Orlando. In the real world - it isn't different. But trying to please the bots can cause business owners and marketers to get into some pretty complicated contortions. My honest preference is to have 1 page per service and 1 page per location and depend on building the authority of the brand so that the site is strong enough to rank for many terms. I think the Moz blog is an awfully good example of this. Search for a topic, and up comes a Moz blog post, and not because we've got "For Seattle, For San Francisco, For Boston" in the titles. The strength of the site, overall, helps with ranking for topics. True, Moz is not a local business, but the concept is the same to me - building the kind of authority that makes whatever you publish on your site seem important/relevant to Google for its topic. So, like I've said, 1 page per service and 1 per location would be my ideal preference. That being said, the legal industry is cutthroat in major cities, and sometimes a Local SEO will find himself doing things that may not seem very sensible, just trying to keep ahead of the competition. If, in auditing your unique competitive scenario, you do feel you have to build a page for every service/city combo, I guess the best you can do is to try to make those pages as unique and helpful as possible. If you're the copywriter, I don't envy you, but it can be done with enough resources and creativity.

    | MiriamEllis
    0

  • Hello, Try this plugin; it will help you migrate your Joomla! site to WordPress and retain almost everything... http://www.fredericgilles.net/fg-joomla-to-wordpress/ Cheers, Arnold

    | arnoldwender
    0

  • Hi Surfsup, I agree you should concentrate on one domain and agree that it should be the older one. Before deciding on whether to redirect or not, I recommend you profile the links to the site. There might be some good ones that would, as you say, give your elder site a needed boost. If that's the case, you should consider redirects or asking the source to point to your older site directly so the entire link value transfers and can be sustained over time. If you have bad incoming links, then you'll have to weigh the decision to disavow them before redirecting or not bothering to redirect at all.

    | DonnaDuncan
    0

  • I'll second what Martijn is saying: Alexa can be hugely inaccurate, and the smaller your site is (traffic-wise), the more inaccurate it is. If you have the Alexa toolbar installed in your browser, then you are boosting your stats on Alexa, as you spend more time on your site than anyone else. I only use Alexa the odd time to get a very rough idea of competitors' traffic.

    | PaddyDisplays
    0