Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Local Website Optimization

Considering local SEO and its impact on your website? Discuss website optimization for local SEO.


  • Nice answer, Jane. This answered many of the questions I had about the best URL for my website. Cheers!

    | _nitman
    1

  • Hi everyone! I'm happy to say that Moz Local is now available in the UK. You can learn more about it here: https://moz.com/blog/moz-local-uk --Kelly

    | kellyjcoop
    0

  • Given that all the other geotargeting factors are fine, using a domain name whose IP is from the USA would be a good added signal. Remember, you don't necessarily need to have your site hosted in the USA to have a USA IP.

    | gfiorelli1
    0

  • Hi Miriam, thanks for the response. I completely understand what you are saying and agree with you. I always play by Google's rules, but occasionally the real world has to be considered. In this instance it is more important, and more financially beneficial to the company, to have the virtual office near where all their clients are currently based. But we don't want anything to happen to the website, as it is a source of reference for their customers, so we have already put in place hiding the home address and showing Google the area serviced by the client. We won't be targeting the virtual address, as the business is strong enough to appear within the location they service. -Christina

    | ChristinaRadisic
    0

  • Looks like your client is part of a link network with exact-match anchor text; check out these: http://cornhilldental.co.uk/index.htm http://www.modburydentalpractice.co.uk/ http://www.farcottondental.co.uk/links.html

    | TheeDigital
    0

  • Oops, the above post was from me. Sorry, I was logged into a different account when responding.

    | MiriamEllis
    0

  • Hi Miriam, thank you for helping! Yes, this is what I wanted to know: whether there is a way to mark up a local page that lists multiple vendors (not individual pages with only one vendor). Thanks

    | odmsoft
    1

  • My guess is you'll see an improvement in rankings moving to a subfolder structure, but a possible decrease in the number of results you get on page one (if the subdomains were ranking there previously).

    | David-Mihm
    1

  • Just want to thank Miri Offir for the question. Her detailed post and this discussion are the answer to everything I needed!

    | vernonmack
    1

  • Addendum to this. You may actually want to look at the pages you are targeting for those keywords and the amount of organic traffic going to each page. Why? This encompasses more than just your ranking: are people clicking through? We have run experiments changing the meta description and compared traffic vs. ranking. While ranking and click-through are related, they are still two different things.

    Finally, you have to look at what converts. I like to use Google's Page Value metric: I can see not only what content is ranking and getting traffic, but also what converts. I then use that to plan new content or decide what existing content needs to be updated. Get your metrics to tie out to the bottom line; that is ultimately what matters. Cheers!

    | CleverPhD
    0
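The Page Value metric CleverPhD mentions can be sketched numerically. Assuming GA's documented definition, Page Value is roughly (transaction revenue + total goal value) divided by unique pageviews for the page; the function name and figures below are invented for illustration.

```python
# Rough sketch of Google Analytics' Page Value calculation.
# Assumption: Page Value ≈ (transaction revenue + total goal value)
# / unique pageviews for the page; names here are illustrative.

def page_value(transaction_revenue, total_goal_value, unique_pageviews):
    """Estimate a page's contribution to revenue per unique pageview."""
    if unique_pageviews == 0:
        return 0.0
    return (transaction_revenue + total_goal_value) / unique_pageviews

# A page appearing in sessions worth $500 revenue and $100 goal value,
# across 120 unique pageviews:
print(page_value(500.0, 100.0, 120))  # 5.0
```

Pages with a high Page Value but stale copy are good candidates for the content updates described above.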

  • Key point by Rebecca: use data to make this decision. I just 410'ed almost 800 old pages/articles from a website I help run. They were all republished press releases at least 2 years old; they got fewer than 9 organic pageviews over the past 6-month period and had no link equity. You have to do some work merging this data from GA and OSE, but it is worth it. I could say that when I deleted those 800 pages I was not losing significant traffic or links, and I was improving my crawl efficiency with Google, and potentially a quality factor, since Google no longer had to look at crappy old content. Another way to say this: if users were not visiting the pages and no one was linking to them, how could they be useful? If anything, they would make my site look less reputable. Cheers!

    FYI - the spider Screaming Frog (one of my fav tools) just integrated with the GA API, so you can crawl and get GA data combined. (You can also just play with GA filters.) If Screaming Frog can get the tool to access the Moz API - BOOM! That would make this work so much easier. (Hint hint, Mozzers, this would be an amazing feature for the Moz crawler as well!)

    | CleverPhD
    0
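The pruning decision described above — merge analytics traffic with link data, then 410 whatever fails both tests — can be sketched as a simple filter. The field names are assumptions; the thresholds (9 organic pageviews over 6 months, zero linking domains) mirror the numbers in the post.

```python
# Hypothetical sketch of a content-pruning filter combining GA-style
# pageview data with link counts. Field names and sample data are made up.

def pages_to_prune(pages, max_pageviews=9, max_linking_domains=0):
    """Return URLs with negligible organic traffic and no link equity."""
    return [
        p["url"]
        for p in pages
        if p["organic_pageviews_6mo"] < max_pageviews
        and p["linking_domains"] <= max_linking_domains
    ]

pages = [
    {"url": "/press/old-release-2013", "organic_pageviews_6mo": 2, "linking_domains": 0},
    {"url": "/blog/evergreen-guide", "organic_pageviews_6mo": 480, "linking_domains": 12},
]
print(pages_to_prune(pages))  # ['/press/old-release-2013']
```

Anything the filter returns is a candidate for a 410, as in the post; anything with links would be a 301 candidate instead.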

  • Hi, sorry for the late response, I've been away for a few days. Yes, it does help a lot. Many thanks for your help. Pete

    | PeteC12
    0

  • Aw, thanks, EGOL! Whoa, looking at those SERPs, I've been a busy bee.

    | MiriamEllis
    1

  • Those pages will need to be recreated and then have a 301 redirect placed on them, passing all existing authority on to the newly created city-targeted pages. I suggest pulling the data from before the redesign showing how much traffic was being sent to the pages that were killed, and showing this to your client. Doing this should help them see that whoever suggested killing those pages in the redesign shouldn't be making decisions. Once you've 301'd the old pages that are showing 404s to the new city-targeted pages, go into Webmaster Tools and submit them to be indexed/crawled right away.

    | montana.marsden
    0
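The 301 step above amounts to maintaining an old-to-new URL map and turning it into server redirect rules. A minimal sketch, assuming an Apache-style setup; the paths are hypothetical, not from the thread.

```python
# Illustrative sketch: map each killed city page to its recreated page,
# then emit Apache mod_alias 'Redirect 301' directives. Paths are made up.

redirects = {
    "/locations/austin.html": "/austin-tx/",
    "/locations/dallas.html": "/dallas-tx/",
}

def apache_redirects(mapping):
    """Render one 'Redirect 301 old new' line per pair, sorted for stable diffs."""
    return [f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())]

for line in apache_redirects(redirects):
    print(line)
# Redirect 301 /locations/austin.html /austin-tx/
# Redirect 301 /locations/dallas.html /dallas-tx/
```

Keeping the map in one place makes it easy to verify later that every 404 reported in Webmaster Tools has a corresponding rule.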

  • I wouldn't recommend this approach. You definitely won't get penalized, but you'll hurt your domain authority. Subdomains are treated by Google as different websites (here's a great Whiteboard Friday about it), so by creating a subdomain instead of a folder you're diluting your "potential". The better approach for your situation is to create folders with geo-specific locations and high-performing/high-volume keywords. An example of this would be www.domainname.com/city-stateabbreviation-keyword (http://bestdefensega.com/woodstock-ga-lawyers, or better yet use more intent-specific long-tailed keywords like http://bestdefensega.com/woodstock-ga-lawyer-services). This way, when your geo-specific pages get recognition from Google (whether through user data, link equity, etc.), it's not being diluted; its full "potential" is given to your root domain, which will help boost all of your pages. Hopefully this helps! -Jacob

    | montana.marsden
    0
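The /city-stateabbreviation-keyword pattern suggested above is easy to generate consistently with a small slug helper. The slugification rules here are my own assumption (lowercase, hyphens, ASCII only), not anything the post prescribes.

```python
import re

# Hedged sketch of building the geo folder URLs described above:
# /city-stateabbreviation-keyword, lowercased and hyphen-separated.

def geo_url(city, state_abbr, keyword):
    """Build a lowercase, hyphenated geo folder path from its parts."""
    parts = f"{city} {state_abbr} {keyword}"
    slug = re.sub(r"[^a-z0-9]+", "-", parts.lower()).strip("-")
    return f"/{slug}"

print(geo_url("Woodstock", "GA", "lawyer services"))
# /woodstock-ga-lawyer-services
```

Generating every city page's path from one function keeps the structure uniform, which matters once dozens of geo folders exist.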

  • Hey Patrick, Adam's most important tip is to use creativity so these pages don't read in a robotic, repetitive fashion; that applies to how you write all tags, as well as main body copy. (Good point, Adam!) Personally, I wouldn't worry about the number of times you repeat a keyword in the text. Trying to meet numeric quotas can kill creativity. Write as beautifully and helpfully as you can on every page you publish, and you'll probably find that you are naturally optimizing all tags and text without having to jump through any hoops.

    | MiriamEllis
    0

  • Thank you for taking the time to respond! Michael

    | YourHollywoodPortrait
    0

  • Hi Lora, Yes, it's definitely possible! I work for Rover.com - if people search for "rover", Google's not sure if they mean us, Land Rover, the Mars rover, the movie The Rover, etc. I'll admit that we don't often compete with trending news, but we're competing with companies and concepts that are much older and better known than us. Here's what I've found:

    **Google likes variety.** So, even if you can't rank #1 for the name of your company, chances are you're going to rank #2 after the celebrity/news, even if the celebrity/news has enough content to take up the entire page. If people are searching for your brand, they're going to see your results, even if you're #2. (In case this isn't a given: you want to make your company _stand out_ from the celebrity/news, not try to pick up on interest from it. Otherwise you'll be competing with all the news media outlets.)

    **Google understands that two separate things can have the same name.** We never rank for "Land Rover" or "Mars rover", and neither of those two very, very popular subjects ranks for "Rover dogs." Google's pretty good at filtering out results that aren't a good match for all of the terms in a search.

    **Most searchers are willing to refine their searches.** If all else fails, searchers are probably okay with searching for your brand + your service if they don't see your site when they only search for your brand. An advantage here: if people are looking for your brand, they won't be distracted by completely unrelated information, like celebrity news. This would be a bigger problem if the celebrity news were also a business competitor.

    Hope this helps! Kristina

    | KristinaKledzik
    0