Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi Beytznet, Some questions: Is your national business virtual (like an e-commerce website), or does it have actual, physical locations where face-to-face business is transacted with customers? If you have physical locations, does each have a unique address, not shared by any other business? Does each physical location have its own local area code phone number? Some advice: If you do have staffed physical locations at which in-person transactions happen, at unique addresses with unique local phone numbers, then you qualify for inclusion in Google's local index. Think of a chain store like Whole Foods. It has a corporate website, but also has a local listing for each of its locations. This is the model any such chain would follow. Appearing in the local results will depend first on whether Google already shows local results for your queries. If Google doesn't already provide a local pack of results for your queries, there is no way to prompt them to do so. If Google does show local results for your desired queries, then you must pursue high rankings via a variety of efforts including, but not limited to:

    - Running a strong, excellent website that works to build authority
    - Creating a unique landing page on the website for each of your physical stores, with the complete name, address, and phone number, preferably encoded in Schema markup
    - Creating a violation-free Google+ Local page for each of your stores and linking each listing to its respective landing page on the website
    - Creating citations for each of the stores on third-party local business directories
    - Earning reviews
    - Earning links, doing social outreach, video marketing, etc.

    If you do not meet all of the requirements I've mentioned, you do not qualify. You can read the complete Google Places Quality Guidelines here: https://support.google.com/places/answer/107528?hl=en Hope this helps!
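    For illustration, the NAP details on each store's landing page might be encoded with Schema.org markup along these lines - a minimal JSON-LD sketch in which every name, number, and URL is invented:

    ```html
    <!-- Hypothetical store landing page snippet; all values are placeholders -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "GroceryStore",
      "name": "Acme Foods - Springfield",
      "telephone": "+1-217-555-0123",
      "url": "https://www.example.com/stores/springfield",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "IL",
        "postalCode": "62701",
        "addressCountry": "US"
      }
    }
    </script>
    ```

    Each location's page would carry its own unique name, address, and local phone number in a block like this.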

    | MiriamEllis
    0

  • Sure - I can confirm this behaviour because we've experienced it already. Google is keen to show content that is closely related to the country from which a user starts the search query... but what you describe there will take some time to show up. By the way, I'm not sure your efforts will be rewarded that much by Google... it's a small step, and the impact on SEO might be questionable relative to the time you need to make these changes.

    | dotfly
    0

  • Hi Howlusa, My 2 cents: I prefer .com/birthday-parties-chicago It's very easy to understand and should stand you in good stead.

    | MiriamEllis
    0

  • Casey - Let me take a quick stab at your question. I think you are asking what effect a nofollowed link vs. a followed link has on SEO rankings for a site? If so, then the answer is that, generally, nofollow links coming into your site don't help your SEO efforts. That said, the real answer is a little murky, and I'll take a stab at explaining it below.

    According to Moz's Open Site Explorer, 2.97% of all links they found were nofollowed, out of 106 billion URLs and 150 million root domains: http://www.opensiteexplorer.org As an FYI, all of the links from Moz.com's Q&A section are nofollowed. Wikipedia's external links are nofollowed as well… this was done as a means to reduce abuse of the system and prevent people from using Wikipedia as one giant inbound link source. According to the nofollow Wikipedia entry (http://en.wikipedia.org/wiki/Nofollow), nofollow was originally suggested to stop comment spam in blogs, and in early 2005 Google's Matt Cutts and Blogger's Jason Shellen proposed the nofollow value to address the problem.

    Generally speaking, nofollowed links don't help your site from an SEO perspective. That said, Google has left it a bit open with their answer on how they handle nofollowed links (https://support.google.com/webmasters/answer/96569?hl=en): "How does Google handle nofollowed links? In general, we don't follow them. This means that Google does not transfer PageRank or anchor text across these links. Essentially, using nofollow causes us to drop the target links from our overall graph of the web. However, the target pages may still appear in our index if other sites link to them without using nofollow, or if the URLs are submitted to Google in a Sitemap. Also, it's important to note that other search engines may handle nofollow in slightly different ways."

    Here's what Matt Cutts of Google says about nofollow links (http://www.mattcutts.com/blog/pagerank-sculpting/): "Nofollow links definitely don't pass PageRank. Over the years, I've seen a few corner cases where a nofollow link did pass anchortext, normally due to bugs in indexing that we then fixed. The essential thing you need to know is that nofollow links don't help sites rank higher in Google's search results."

    It's possible that nofollowed links can actually hurt you if you have too many of them and you're trying to use followed vs. nofollowed links within the internal link structure of your site to "PageRank sculpt" pages on your site. According to Wikipedia: On June 15, 2009, Matt Cutts, a well-known software engineer at Google, announced on his blog that GoogleBot would no longer treat nofollowed links in the same way, in order to prevent webmasters from using nofollow for PageRank sculpting. As a result of this change, the usage of nofollow leads to evaporation of the PageRank of outgoing normal links, as Google started counting total links when calculating PageRank. The new system divides PageRank by the total number of outgoing links, irrespective of nofollow or follow, but passes PageRank only through followed (normal) links. Matt Cutts explains that if a page has 5 normal and 5 nofollow outgoing links, the PageRank is divided by 10 links, and one share is passed by each of the 5 normal links. http://www.mattcutts.com/blog/pagerank-sculpting/

    Back to Wikipedia, though: Google states that their engine takes "nofollow" literally and does not "follow" the link at all. However, experiments conducted by SEOs show conflicting results. These studies reveal that Google does follow the link, but it does not index the linked-to page, unless it was in Google's index already for other reasons (such as other, non-nofollow links that point to the page).

    I hope this helps! -- Jeff
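    For reference, the only on-page difference between the two link types is the rel attribute (example.com is a placeholder):

    ```html
    <!-- A normal, followed link: passes PageRank and anchor text -->
    <a href="http://www.example.com/">great widgets</a>

    <!-- A nofollowed link: dropped from Google's link graph -->
    <a href="http://www.example.com/" rel="nofollow">great widgets</a>
    ```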

    | customerparadigm.com
    0

  • Just as Miriam points out, ranking in the local map pack will be tough outside your specific location, but local content pages can rank and bring leads. I'd consider using blog posts or a project showcase to create location-specific content for areas outside your main location. If you title a blog post "Cityname Kitchen Remodel Project" and perhaps include other niche keywords, you can create great content for a prospect as well as search visibility in other specific areas. A solid post or project showcase page with a text overview of the project, a bulleted list of products used, and before-and-after photos makes a great content addition to the site. Go a step further and create a video recap of the project that can be placed on the page as well as added to YouTube and optimized there, and you have an additional opportunity to land in the SERPs for a local search.

    | BrightHealth
    0

  • For those that are interested, we figured out the root of the problem and the fix. Maybe this will help someone down the road. Thanks to all that responded. As it turns out, all the domains/accounts hosted on our VPS are top-level, so we couldn't use robots.txt to fix this.

    The problem:
    - We have a handful of websites using a shared IP address - only sites with an SSL cert get a dedicated IP
    - Some of these websites have additional A record DNS names (mail.domain1, ns5.domain2, etc.) that all resolve to that shared IP address
    - Within your browser, if you went to one of those records, it would pull up the website (account) that is first alphabetically within that shared IP pool
    - It's assumed that a website or multiple websites somewhere on the web must have linked these domain names somewhere on their webpages - so search engines crawled them for the content of that first alphabetical website
    - Search engines then indexed the content of that site for these other DNS A records and displayed the alternate URLs in the SERPs

    The fix: edit Apache's httpd.conf file on our VPS, adding a new (first-listed) virtual host for that shared IP address that points to the server's cPanel default CGI page instead of listing one of our hosted accounts first. This page describes what we needed done and what fixed it for us: http://forums.spry.com/cpanel-whm/1568-how-can-i-change-default-page-ip-address-points.html Thanks again!
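    As a rough sketch, that kind of httpd.conf change looks like this - a catch-all virtual host listed first for the shared IP, so unmatched hostnames no longer fall through to the first account alphabetically (the IP and paths here are invented):

    ```apache
    # Default (first-listed) virtual host for the shared IP.
    # Apache serves the FIRST matching <VirtualHost> for a given IP:port
    # when the Host header doesn't match any ServerName/ServerAlias.
    <VirtualHost 192.0.2.10:80>
        ServerName server.example.com
        DocumentRoot /usr/local/apache/htdocs
    </VirtualHost>

    # ...the existing per-account virtual hosts follow below...
    ```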

    | Motava
    0

  • Hi Dustin, You're right—in general, a drop in DA that corresponds with a drop in your competitors' DA is most likely due to changes in the Mozscape index (though not the algorithm so much). It's best to look at DA less as an absolute value than as a benchmark against the competitors in your space. Matt

    | MattRoney
    0

  • I wouldn't do anything that could look spammy, and I'd say that might. The traditional way to use an acronym is to place it in brackets directly after the first mention of the written-out term, i.e. "Yesterday I bought a bacon, lettuce and tomato sandwich (BLT)", and then use the acronym for the rest of the article. Write for your audience, though - do they know what a BLT is? If so, I might use BLT in the title. Have you done any keyword research to see which would work best? Saying all that, Google is getting better at recognising synonyms. If I type BLT into Google, I get results for Bacon, Lettuce, and Tomato too - Bacon, Lettuce, and Tomato is bolded in the search results, so Google knows they're often the same thing. Edit - I've just had a thought - an acronym might not be the best example of different phrases. Check these out: http://www.searchenginejournal.com/what-is-latent-semantic-indexing-seo-defined/21642/ and http://moz.com/community/q/latent-semantic-indexing-does-this-help-rankings-relevance You used to be able to search for synonyms in Google using the tilde ~, but they dropped the feature earlier this year.

    | Alex-Harford
    0

  • We see this too.  We have canonicals in place, and we still see the error.  And there's no insight into which parameters are causing issues.

    | Aggie
    0

  • Miriam, you are a rockstar! I just want to add my two cents. Most citation sources find your listing by phone number, so putting more than one phone number per location will generally get rejected or merged by those sources. I strongly agree that each of your locations should have a separate number. Your NAP (name, address, phone number) consistency is also a ranking factor. (Here is Miriam's post about local SEO ranking factors: http://moz.com/blog/top-20-local-search-ranking-factors-an-illustrated-guide) If you are building citations for each of your doctors, I recommend separate numbers for them too (but it's not required). I use a company called ifbyphone.com. They have a basic service plan at $49 per month and $2/month per phone number, plus minutes. You can have them forward to one number if you want, but it's a good way to get around the issue. That's about $200/month plus minutes, but you can use these numbers for multiple things like marketing too (i.e. AdWords, billboards, radio commercials, etc.) That being said, Google Maps says: "Some doctors may share the same office address with other doctors. If the listings have different doctor names, they are not duplicates, even if they have the same phone number. The same goes for lawyers, insurance agents, etc." https://support.google.com/mapmaker/answer/1731387?hl=en The reason I recommend different numbers is for third-party citation sources. If you can justify the $200 per month expense, I would highly recommend using separate numbers for each doctor. You'll be able to build stronger rankings that way. I always worry about Google changing its policy in this area, so I think separate numbers are a better idea.

    | DarinPirkey
    0

  • As long as you have the canonical tags set up correctly, then you don't need to do anything else with the URLs that have parameters in them. Google will eventually start dropping those URLs from their index.
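    In other words, each parameterized variant should point at the clean URL in its head section (the URL here is a placeholder):

    ```html
    <!-- On /widgets?sort=price&page=2 and every other parameterized variant: -->
    <link rel="canonical" href="https://www.example.com/widgets" />
    ```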

    | StreamlineMetrics
    0

  • Yeah, Penguin 2.1 was confirmed. We're not clear on whether there are data updates outside of the official updates - Google hasn't been very forthcoming on that. I have heard of recoveries since 2.0, though. In general, recovery stories are very limited. Penguin is brutal, and the people I know who have been successful have had to make deep cuts. Also, keep in mind that if you cut deep but don't make the right cuts, that may not work either. It's a difficult road, and it depends a lot on the depth of the problem and whether the site has enough good links and a solid enough base for Google to take the link removals seriously.

    | Dr-Pete
    0

  • For geo-redirects, I do not recommend you use 301 redirects. Browsers can cache these, so if you tell a browser in Canada that example.com should redirect to www.example.com/ca-fr, and later the user changes their language to English and then tries to go to www.example.com, the browser could use that cached redirect to go back to the French version without hitting your server. A 301 tells the browser that www.example.com ALWAYS (permanently) goes to www.example.com/ca-fr. PageRank isn't really a consideration with these, since Googlebot always comes from the US, so it should never hit these redirects. If example.com always goes to one of the versions via a redirect (i.e. you don't serve content under that root URL), then you do have a bit of a problem with redirects. You don't want to 302 Googlebot to another page for your home page, but at the same time, you want to avoid weird redirect behaviors for your customers. Google can visit the international versions directly without redirects, right? They should have no problem indexing those pages then. I agree with István: get some local links to your different local versions, register them each with Google Webmaster Tools (and Bing), put up sitemaps for each, and implement the hreflang tags in your sitemaps (or pages). That way Google can easily index each version, and knows exactly what each version is for.
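    To illustrate the sitemap route for hreflang, an entry might look like this - a minimal sketch with placeholder URLs, assuming French- and English-Canadian versions; each URL gets its own entry listing all alternates, including itself:

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <url>
        <loc>https://www.example.com/ca-fr/</loc>
        <xhtml:link rel="alternate" hreflang="fr-ca"
                    href="https://www.example.com/ca-fr/"/>
        <xhtml:link rel="alternate" hreflang="en-ca"
                    href="https://www.example.com/ca-en/"/>
      </url>
    </urlset>
    ```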

    | john4math
    0

  • Hi Tom, Thank you for taking the time to reply. We do want to go the way of changing the URL on the listing page, but had a little confusion over the need for, and type of, overview page. Currently, the URL looks like /used-peugeot/used-toyota-corolla, but with the new URL (/used-toyota/corolla), we were wondering if we needed a kind of overview page for each manufacturer (like /used-toyota, /used-honda)? In the existing structure, there is one generic overview page (/used-peugeot), which is not ideal, but as most of the listed vehicles are Peugeots, that had been the case up until now. So would keeping the /used-peugeot overview page while implementing the new URL (/used-toyota/corolla) be a good idea, or would we have to have an overview page for each manufacturer (e.g. /used-honda, /used-kia) and so on? Once more, thank you very much for taking the time to reply.

    | nirpan
    0

  • Hello KTW..., I would not give these two pages the same rel canonical tag, as they are not the same page, nor the same product: http://wagavenue.com/what-happens-in-the-dog-park-pet-id-tag-for-dogs http://wagavenue.com/very-important-pooch-custom-pet-id-tag-for-dogs Yes, they are the same "shape" and made out of the same material, but that would be like telling an eCommerce site for clothes that it should add a rel canonical tag to all of its cotton T-shirts to combine them on one product page for "cotton T-shirts". That's not a product page. That's a category page. You can have some shared content on product pages about shipping, returns, etc., but if I were you I would invest in as much unique content as you can on each page. Talk about the sayings. The colors. The material. There are a million ways to say the same thing if you're creative. Failing that, you can put the shipping and returns information, and similar templated info, in an iframe or a JavaScript pop-up window, or find some other way of keeping it from being duplicate content on every page. Long story short, if you want to rank for 100 different sayings (e.g. "what happens in the dog park dog tag", "very important pooch dog tag"), you need to write unique descriptions for each of those. There is no way around it without risking your traffic from Google.

    | Everett
    0

  • Thanks guys - I'll keep persevering. Hopefully I'll sort this out shortly!

    | DHS_SH
    0

  • That's an interesting thought Luke. Yes, I agree something like that would work much better. I think a group like that would need some strong affiliations with already recognised online groups of like-minded SEO people (like on Moz) to give it gravity and value, but it could work. I don't know if such a group exists. Peter

    | crackingmedia
    0

  • Catherine, If you have a blog, you should be using structured data. But in any case, that couldn't be the reason for the drop. I'm guessing competitors are using some shady techniques to outrank you and others, or perhaps you lack content. I see only 18 pages were indexed. Having a blog and posting regularly can help you - for example, posting experiences from your customers, or any extra info you can provide on the locations where you offer skiing (I don't know). Do a little research, see what's missing from other pages, and get your hands on it. In any case, if the sites outranking you are spammy, you can always report them to Google. They don't pay too much attention, but you could then open a thread in the Google Webmaster Help forums (if you didn't do any black-hat link building, which apparently you didn't). Once you start creating good content, over time you will earn links, which ultimately, combined, will give you an edge over your competitors. That just needs time, time, and more time. Hope that helps!

    | FedeEinhorn
    0

  • Hi Luke, To include the suggestion on searchenginewatch.com in this conversation, it said: "Submit an updated sitemap to Google Webmaster Tools and use the change of address function if moving to a new domain. Remember to initially keep the old URLs in your XML sitemap to facilitate Google crawling those links and processing the changes in their index." Well, it would be interesting to hear others' feedback on that. Personally, having old URLs in a sitemap (which, without a redirect, would result in a "page not found" 404 error) doesn't seem correct to me. Presumably, you had the URL in the sitemap previously, when the page at that URL was active. But then, by setting up a 301 redirect, you are telling Google that the page at the URL Google has in its index has now permanently moved to a new URL. When you submit a sitemap to Google, you are submitting a list of all the URLs on your site that you are asking Google to crawl. To include the old URL in your sitemap along with the new URL is essentially asking Google to crawl two URLs pointing to the same page. I'm not sure Google would necessarily consider that a canonical issue (because the old URL is no longer current), but for me it's a misuse of the sitemap. As I say, it would be interesting to hear others' feedback on this. Peter
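    For what it's worth, the 301 side of this is a one-liner per moved page in Apache (domain and paths here are invented):

    ```apache
    # Permanently redirect the old URL; only the NEW URL then belongs in the sitemap
    Redirect 301 /old-page https://www.example.com/new-page
    ```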

    | crackingmedia
    0