Hi Tung - welcome.
A lot of times when you search for a keyword that exactly matches your domain, Google will show sitelinks, as long as the keyword isn't generic, even if the site is relatively new. Looks like you just got cached today.
I see, and yes it will.
I know from my real estate clients that the main listings page usually ranks naturally for information found in the listings themselves - "4 bedrooms", for example. We have a real estate client that ranks for "x real estate" and "x homes for sale", but also for "4 bedroom homes for sale in x", simply because the listing summaries include the number of bedrooms (like yours do).
However, for other variables like "no pool", it gets trickier, since no one lists a house on the MLS citing "no pool".
The only two ways around this are to: (1) write unique content on every main page and include the keywords you want, like "no pool"; or
(2) write some unique content for each variable - i.e. write some unique copy on the "no pool" page, some unique copy on the "waterfront" page, etc. Even then you are still running a risk of duplicate copy. Having the titles, breadcrumbs and h1's dynamically change just might not be enough. I would put all of my effort (including link building) into the main landing page and just make sure to include the keywords I want (that's just an opinion).
What is the data showing now - are you being penalized? Are you ranking for any "without pool" or "waterfront" terms and if so, are they getting traffic?
Great points here by both Matt and Nakul. Of particular importance are understanding who will have access to edit, and site duplication. If you want each of your franchise owners to have editing capabilities for their own store, then it may be easier to use subdomains from a permissions perspective.
As for duplicate content, you'll need to worry about that regardless of whether you use a sub or a folder.
As a case study, I've worked on two very large projects like this, and in both cases I used subfolders for the locations; the folders were the city names. Each location ranks in position 1 or 2 across the board, and each shows well in the map packs.
Hi Joshua,
There are a number of ways to stop Google from counting your dynamic URLs as duplicates. It's unclear from your question why you can't use canonical tags for this. If you went here:
http://luxuryhomehunt.com/homes-for-sale/lake-mary/hanover-woods.html
And added this canonical tag in the HEAD section: <link rel="canonical" href="http://luxuryhomehunt.com/homes-for-sale/lake-mary/hanover-woods.html" />
It would solve your issue of duplication when people choose property variables like waterfront or bedroom count. I think you were trying to point out the reason this won't work at the end of your question, but I'm not exactly sure what you are alluding to there?
Hi Joshua -
If someone is linking to the www version then it doesn't pass as much juice as it would if it weren't redirected (there's lots of info on this on the internet, with varying opinions). Overall, most SEOs agree that an inbound link that points directly to a page without being 301 redirected has more of a positive SEO effect.
With that being said, in your case Google Webmaster Tools may be detecting this double redirect error simply because there is an external website somewhere linking to the 'www' version. You can find this using OSE, or in WMT by going to Crawl Errors and looking for the sunny-isles URL. Clicking on it (if it's there) will show who is linking to you and from where.
BTW - when did you do the redirects, and how long ago did you notice the new URL wasn't indexed (and was the old URL indexed?)
Thumbs up to Kane's advice.
If you checked with your host and 'everything was fine' but 10 minutes later the site was working again, you might want to get an uptime tracker.
I agree with Dana - the only thing you would need to worry about is if the link were a direct text link using a keyword anchor that is used abundantly throughout your link profile. Even then, if it's an authority site that is relevant, this is not something to worry about at all. In fact, it's probably a great link (regardless of the sitewide instances).
The link you referenced has 'www' in it - is that how the link is targeted on your website? If so, it's probably the double redirect that is causing the issue. Since WP is set to 'non-www', every time there is a call for the www version of a URL, WP automatically 301 redirects it to the non-www version. There is nothing wrong with this.
It's when there is a call for a 'www' version of a URL that has also been redirected, as the one you cited has, that a double redirect takes place:
http://www.luxuryhome.../sunnyisles.html
to the 'non-www' version:
http://luxuryhome.../sunnyisles.html
then from there to the new html file version:
http://luxuryhome.../sunny-isles.html
The header check shows a normal www to non-www redirect first (WP is doing this), and then the 301 redirect that changes the sunnyisles to sunny-isles. Both server responses seem OK so the redirects themselves seem to be working. What you want to make sure of is:
Any internal links linking to the old sunnyisles.html page do not contain 'www'. (And in any event, these links should be changed to point to the new page anyway).
Any inbound links from external sources do not reference the 'www' version.
It would be helpful if we could see the .htaccess file as well.
Nathan - the truth is that simply having more social media signals, more links and more indexed pages doesn't mean you will outrank a competitor. Although these statistics may give you a better DA than your competitors, that metric is used more as a general reference. What you should be doing is breaking down the three signals above as follows:
1. Social Media: OK so you have a lot of 'likes' - are people tweeting, posting on your fb page? Bookmarking? Social activity is the big signal.
2. You have more links - are they relevant? Are they from separate relevant domains? I've had instances where clients come on board and say "I have 5000 links and my competitor has 200 and they outrank me". Further investigation shows that the '5000' links were from only 58 domains, most of which were spammy blog links that were not directly relevant in content (paid blog networks). The competitor however, had 200 links from about 150 domains, and many of those domains were specifically about the vertical. In other words, they were natural relevant links.
3. A lot more indexed pages. Again this means nothing - what matters is that each page has value, is unique and informative according to Google. In fact, if the pages aren't deemed informative and unique, having more indexed pages can be harmful. Just because Google hasn't dropped duplicate/spammy pages from their index doesn't mean that they are 'helping' you rank. In this case you are blogging so as long as the blog posts are unique then you should be ok, but always remember quality not quantity.
Pay close attention to what Ben mentions about Local SEO. If you poke around long enough with your keywords you'll notice obvious trends in your area (that may not equate to other regions) particularly with the words "in" or "near".
You'll start to notice how certain prepositions trigger the map packs. It's just something to pay attention to depending on your keyword + locale.
If implemented properly, the URL should regain its "pagerank", for what that's worth. I don't believe that the age of the URL will make a difference, but the backlinks definitely have at least some effect. You will want to do your best to have inbound links changed instead of redirected where feasible.
If I understand your question, you cannot redirect abc.com URLs from an htaccess file on ghi.com. That directive has to be placed on abc.com's htaccess file.
So the abc.com htaccess would show the specific redirects to ghi, i.e.:
redirect 301 /wines.html http://www.ghi.com/wines
def.com would also have an htaccess with redirects for each landing page:
redirect 301 /trade.html http://www.ghi.com/trade
ghi.com would not need any redirects since that's the site you want people to land on.
Unless you want all urls on abc.com to simply redirect to the root www.ghi.com, you have to write a redirect for every page you want to redirect (or use advanced code to rewrite every landing page to the new domain).
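As a rough sketch of that "advanced code" option, a mod_rewrite rule in abc.com's .htaccess can 301 every request to the matching path on ghi.com. This is only an illustration - the pattern assumes your paths map one-to-one between the two domains, which may not fit your actual URL structure:

```apache
RewriteEngine On
# 301 every request on abc.com to the same path on www.ghi.com
# (assumes old and new paths match; adjust the pattern if they don't,
# e.g. to strip .html extensions as in the per-page redirects above)
RewriteRule ^(.*)$ http://www.ghi.com/$1 [R=301,L]
```

Note that with a blanket rule like this, old extensions such as .html carry over unchanged, so the per-page `redirect 301` lines are safer when the new URLs differ.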
You do lose some juice on 301s, so obviously the best course is to contact the sites that house the link and ask them to change it to the new URL (once the new URL is live - and you would still 301 redirect). This isn't always easy to do, particularly when there are thousands of backlinks, so it really depends on how feasible an option that is and how many backlinks you have (are there only 3? Then having an optimized URL will probably be more beneficial than leaving the URL unoptimized, even if you can't have the links changed).
As to 'exactly' how much you lose? I don't think anyone has a definitive answer. But I have worked with websites that 301 redirect almost every page when they migrate to a new platform, and the SEO impact is not severe if done properly.
I still recommend mining your backlinks and having their targets changed (at least for the more authoritative ones).
Particularly if it's the homepage, you really need to check the backlinks and the anchor thresholds. Although there may be no case studies, we have successfully recovered from both Penguin-related drops and unnatural link penalties. The homepage is usually the target because the homepage usually has the most (spammy) links.
I was going to respond to this, however Mathew and Marie are spot on. The first order of business is to check positioning for your products to see if you are being marginalized behind other pages with the same descriptions. (While you're in there, check those pages' Page Authority and backlinks to compare to yours as well.) Check to see how many of your product pages are indexed. You can also do a litmus test on a few products by adding unique descriptions and waiting to see if there is improvement.
Marie is exactly correct - the unnatural links penalty and Penguin are two separate entities.
The best way to combat a Penguin-related traffic drop is to use your SEOmoz tools to mine your backlinks and then sort by anchor. Traditionally, link builders would buy a majority of their links with the same anchor text (or close variations). Websites that have a majority of their links on one anchor can get hit by Penguin (there are some instances where this isn't the case where brand and exact-match anchor domains are concerned, but generally it holds).
If your links were natural, Google would see links of all types in your profile, including "click here", "www.url.com", etc. You should try to decrease your keyword anchor threshold to below 30%, either by getting rid of spammy keyword-anchor links or by building more generic-anchor ones.
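To make the threshold concrete, here is a minimal sketch of computing what share of your backlinks use a money-keyword anchor. The data format and sample links are hypothetical - in practice you'd load the (URL, anchor) pairs from your backlink CSV export:

```python
from collections import Counter

def anchor_share(links, keyword_anchors):
    """Given (source_url, anchor_text) pairs, return the fraction of
    links whose anchor text matches one of the money keywords."""
    counts = Counter(anchor.lower().strip() for _, anchor in links)
    total = sum(counts.values())
    keyword_hits = sum(n for anchor, n in counts.items()
                       if anchor in keyword_anchors)
    return keyword_hits / total if total else 0.0

# Hypothetical sample profile: 3 of 5 links use the keyword anchor
links = [
    ("http://a.com/page", "miami real estate"),
    ("http://b.com/page", "miami real estate"),
    ("http://c.com/page", "click here"),
    ("http://d.com/page", "miami real estate"),
    ("http://e.com/page", "www.example.com"),
]
share = anchor_share(links, {"miami real estate"})
print(round(share, 2))  # 3/5 keyword-anchor links -> 0.6, over the 30% line
```

A profile like this (60% on one anchor) is exactly the shape that gets flagged; a natural profile would spread that count across brand, URL and generic anchors.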
Doug is spot on - it's just that easy. Those CSV exports literally tell you everything you need to know.
Sameer is also recommending a good tool, Xenu, if you don't want to use your crawl report. However, Xenu can cause issues (rare) with sites with dynamic content, or it can just keep crawling forever with newer dynamic sites.
Creating a site with multiple landing pages targeted to different regions is not new, so Google has made updates to attempt to stop low-quality sites from capitalizing on localized keywords ("miami x", "tucson x", "san diego x", etc.), where x is your main keyword.
What this means is that you need to do more than simply duplicate your pages, mix up the keywords, replace the local terms, and create new URLs and titles/descriptions. What you should do is create completely unique copy, dynamic content and/or user engagement on each landing page; local citations will help each landing page too, and make sure to get local backlinks to each landing page.
Perhaps you should look at adding canonical tags to your pages - then you wouldn't need to worry about any dynamic URLs causing duplication issues.
As for unique descriptions for puzzles - sure, it's tough - but I recommend sitting down and getting it done, or hiring a professional to do it. I see lots of good advice here, but sometimes it can be overwhelming and just too time-consuming to do yourself.
I'm curious to know if this has been resolved. Like others here, I've been successful on several re-inclusions, so here are my two bits:
Template duplication is probably not related to the manual penalty.
Duplicate content will definitely tank your sites (but may not come through as a "violation" email in WMT).
The culprit is almost certainly backlinks. The bonus is that you have access to all the sites - were any not penalized? When you got your backlinks, did you pay for packages across multiple domains?
When I do re-inclusions for agencies that handle multiple clients, I compile a database of backlinks using Open Site Explorer and the External Links section of WMT, and then use Excel functions to find which links are common to all penalized sites and which are not.
By doing so you can really highlight the backlinks that are most probable for causing red flags. Remove those links, and write Google. Just tell them you understand you have bad links, and that you've removed as many as possible. List the links right in your email.
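The "common to all penalized sites" step is just a set intersection, which you can do in Excel or in a few lines of script. A minimal sketch - the domain names and backlink sets here are made up, and in practice you'd read them from your OSE/WMT exports:

```python
# Hypothetical per-site backlink exports (e.g. parsed from OSE CSVs or
# WMT External Links), keyed by the penalized domain they point to.
backlinks = {
    "site-a.com": {"spamblog1.com/post", "directory.com/listing", "news.com/article"},
    "site-b.com": {"spamblog1.com/post", "directory.com/listing", "forum.com/thread"},
    "site-c.com": {"spamblog1.com/post", "directory.com/listing"},
}

# Links appearing on every penalized site are the most probable red flags.
common = set.intersection(*backlinks.values())
print(sorted(common))  # ['directory.com/listing', 'spamblog1.com/post']
```

Those common links are the ones to remove first and list in your reconsideration email; links unique to one site are less likely to be the shared cause.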