You should take a look at the metrics in http://www.opensiteexplorer.org
As you will see you have only 3 linking root domains, 23 total links, Page Authority 30, and Domain Authority 19.
Focus on getting inbound links.
Since we truly don't know Google's secret recipe, we can't say if PR is split.
I don't believe you will lose any link juice with duplicate links. You can have 10 links to the same URL, but Google only sees this as one URL.
Interesting article on SEOmoz:
http://www.seomoz.org/ugc/an-internal-link-juice-tool-14969
Interesting PR tool from that article: (PageRank Link Juice Calculator)
http://www.ecreativeim.com/pagerank-link-juice-calculator.php
I read a comment from SEOmoz's Dr. Pete:
"Google sometimes views navigation differently, but it's a bit hard to say. My general assumption with Google ignoring 2nd (3rd, etc.) links to the same Page B from any Page A, is that they ignore those links completely. Since they don't pass PR and their anchor text doesn't count, I'd tend to treat that page as if it had 10 links. If anyone has evidence to the contrary, I'd certainly love to hear it."
View Post: http://www.seomoz.org/blog/how-many-links-is-too-many#jtc131211
Just make sure that after you install the template, that you take a look at the code and make sure everything is fine for SEO. Remember that you can always modify the code to make it work for you.
I would follow the URLs that GWT says are bad; this will give you the answers. Also, for all bad URLs, have GWT re-crawl them: "Fetch as Googlebot" and "Submit to Index" under "Diagnostics".
Just by taking a quick glance, I see many issues.
First of all, speed (something looks wrong with the setup):
http://www.lifestyleblinds.com/home.php?cat=
I'm getting reports of many broken links.
The logo link to the home page goes to /home.php
Google has indexed 1,260 links with X-Cart in the URL: http://xcart.lifestyleblinds.com/Roller-Blinds-Karunda-Cream.html
Internal links have "?click=srclick" in the URL: http://www.lifestyleblinds.com/Vertical-Blinds-Rustica-Cream.html?click=srclick
I would recommend CDSEO (An SEO Toolkit built for X-Cart) http://www.websitecm.com/x-cart-mods/cdseo-pro-x-cart-seo.html
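As a quick interim fix before the toolkit is in place (my own suggestion, not part of CDSEO), the parameter and subdomain variants can be consolidated with a rel="canonical" tag pointing at the clean product URL:

```html
<!-- Placed in the <head> of each duplicate variant, e.g. the
     ?click=srclick version, pointing at the clean product URL. -->
<link rel="canonical"
      href="http://www.lifestyleblinds.com/Vertical-Blinds-Rustica-Cream.html" />
```

That tells Google which version of the page to index, so the duplicates stop competing with each other.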
Here is a great post about that topic.
http://blog.tamar.com/2011/10/google-rids-keywords-of-apostrophes/
Even if you make the links nofollow, you will still lose link juice.
Just focus on putting your best links towards the top.
Here is a great post explaining why you still will lose link juice.
http://www.seomoz.org/blog/google-maybe-changes-how-the-pagerank-algorithm-handles-nofollow
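For reference, nofollow is just a rel attribute on the link (example.com here is a placeholder):

```html
<!-- The link still consumes its share of PageRank; under Google's
     updated handling, that juice evaporates rather than being
     redistributed to the followed links on the page. -->
<a href="http://www.example.com/" rel="nofollow">Example link</a>
```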
It would be great to see if anyone has any stats on this. I did recently read a good, related article on HubSpot: "How to Write Call-to-Action Copy That Gets Visitors Clicking."
I know it isn't exactly the answer you are looking for, but I feel they have some good points and suggestions.
OSE takes a while, so don't expect sudden results. Start analyzing the sites/pages these links are on in OSE.
For example, if www.example.com (where your link is from) isn't in OSE, then OSE won't find your link.
If adding links, try to keep the links within the same category. For example on the product page you provided, I would have links to the other 5 horse turnout blankets.
The goal with internal linking is to focus on category and silo groupings. Put another way: look at each page and remember that everything on it should be related to what you want that page to rank for.
And to answer your overall question, I don't believe the changes you have made would hurt you. But really try to focus your links as groups.
As for sitemaps, if you have correctly designed your site, a sitemap really isn't necessary, so I would probably get rid of it.
schema.org is becoming the standard.
Here is a great post from SEOmoz: Schema.org - Why You're Behind if You're Not Using It...
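As a sketch of what that looks like in practice, here is product markup using schema.org microdata (the product name and price are made up for illustration):

```html
<!-- Product marked up with schema.org microdata: itemscope/itemtype
     declare the entity type, itemprop labels each property. -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Acme Roller Blind</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    $<span itemprop="price">49.99</span>
    <link itemprop="availability" href="http://schema.org/InStock" />
  </div>
</div>
```

Markup like this is what lets Google show rich snippets (price, availability, ratings) in the search results.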
It would have to be a significant amount of bad links coming in. And this would have to be an ongoing practice from the competition to do something like this.
At a quick glance, it looks like you might have over optimized the site.
I would slowly start taking steps backwards and see what happens with each step.
Also, looking at OSE, it looks like you are way too heavy on inbound links with "jobs in kent" (75 domains, 103,647 links) and "jobs around kent" (73 domains, 4,029 links).
From there it drops to 4 domains with 16 links.
If you have your sitemap.xml and have submitted it to both Google and Bing, then you should be fine.
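For anyone setting one up from scratch, a minimal sitemap.xml looks like this (example.com and the dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Submit the file's URL through Google Webmaster Tools and Bing Webmaster Tools, or reference it in robots.txt with a `Sitemap:` line.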
Minify your CSS, use shorthand properties, avoid @import, and remove old, unsupported CSS.
Since I'm not sure what kind of warning you are getting, it is really tough to answer your question.
Here is a great css tool: http://www.minifycss.com/css-compressor/
and you can Validate it here: http://jigsaw.w3.org/css-validator/
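A quick before/after showing the shorthand point (the selector name is made up):

```css
/* Longhand: four separate declarations */
.sidebar {
  margin-top: 10px;
  margin-right: 5px;
  margin-bottom: 10px;
  margin-left: 5px;
}

/* Shorthand equivalent: one declaration (top/bottom, left/right),
   fewer bytes even before minification */
.sidebar { margin: 10px 5px; }
```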
If you are going to work with an outside company to help you build links, then you should thoroughly investigate the company and make sure they are on the up and up. And when they start, really review their reports and watch very closely all metrics with your site.
Taking a quick look, you have a ton of spammy, bad links coming in, and they point to internal pages, both existing and non-existent.
57 - Linking domains to home page
3,766 - Linking domains to all pages
You have a lot of cleanup to do. I would start by setting up your account with Google Webmaster Tools and running an internal scan to start cleaning up everything.
Since SEOmoz doesn't publish all of the metrics they use in the calculation, it's hard to say just how strong the correlation is. But I would expect the correlation to be there, since good crawl penetration and juice distribution lead to better indexing and a better user experience.
SEOmoz Domain Authority
Domain Authority represents SEOmoz's best prediction about how a website will perform in search engine rankings. Use Domain Authority when comparing one site to another or tracking the “strength” of your website over time. We calculate this metric by combining all of our other link metrics (linking root domains, number of total links, mozRank, mozTrust, etc.) into a single score.
To determine Domain Authority, we employ machine learning against Google's algorithm to best model how search engine results are generated. Over 150 signals are included in this calculation. We constantly refine this model over time. This means your website's Domain Authority score will often fluctuate. For this reason, it's best to use Domain Authority as a competitive metric against other sites as opposed to a historic measure of your internal SEO efforts.
I would have to agree with Stephen. There isn't any content on that page.
It is alright to have outbound links; just make sure they are relevant to the page they appear on. I believe I have read that relevant outbound links are actually good for SEO (but don't go crazy).