I am currently using this one: http://www.feedaty.com/eng/
Posts made by max.favilli
-
RE: Is Trustpilot worth the money?
-
RE: Schema, aggregate ratings and trustpilot
In addition to all the good things said by others, be careful about the following:
- If you add AggregateRating markup for "ACME Inc.", it will probably show up when people are querying for "ACME Inc." (as you may know, Google takes a variety of factors into account, not just the presence of the markup), but if your organization is "Potato and Tomato Inc." and the user query is "Potato Tomato", it won't show up.
- Don't confuse Google's guidelines for "Review" with "AggregateRating": you may have multiple reviews on the same page, all nicely marked up, and that's perfectly fine for Google, but you may have only one AggregateRating markup per page.
Of course you can have AggregateRating plus additional schema.org markup on the same page, such as reviews, but only one AggregateRating markup.
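As a sketch, a single organization-level AggregateRating could be expressed in JSON-LD like this (the name, rating value, and review count are made-up placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Potato and Tomato Inc.",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "bestRating": "5",
    "reviewCount": "128"
  }
}
</script>
```

Remember: only one block like this per page, even if the page also carries individual Review markup.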
-
RE: Is Trustpilot worth the money?
Have you run any tests of your own to measure CR improvements due to TrustPilot or any other review platform?
-
RE: Is Trustpilot worth the money?
Based on my experience, asking customers for reviews through an external authority like TrustPilot has a higher redemption rate than just sending out a review request under your own brand. For the best results (again in terms of redemption) the review request should be sent from a sender email address within your organization and be personalized, but it should be clear at all stages that an external authority is collecting the review and guaranteeing the customer that it's managed objectively.
That said, I don't particularly like TrustPilot. I don't know if they have changed things, but when they approached me:
- there was no option to import existing reviews;
- you were not the owner of the reviews and could not download them (to keep them if you decided to cancel the contract with TrustPilot);
- there was an upfront fee, while competitors had none;
- the sales guy pretended their widget would magically make star ratings appear in the SERPs, which is not true.
Once you sign with them and they set up your business page on their website, you get a do-follow backlink from a DA 69 website, which is nice, but that's the only SEO benefit you get.
The SERP star rating depends on schema.org markup: if you mark up your pages appropriately, it may show up. TrustPilot reviews have no influence on Google's decision to show star ratings for your pages in the SERPs.
It's a different story for seller ratings in AdWords: there, TrustPilot or any other review platform that has a partnership with Google will influence your seller rating, and star ratings will appear if your AdWords account and reviews meet the AdWords requirements for seller ratings.
As for conversion rate, in my experience showing the TrustPilot badge improved CR by between 1.5% and 2.5% (depending on the website); the tests were run changing nothing else, just adding the badge and the review widget. Quite far from what they promise.
So I would suggest choosing a review platform and using it in place of an in-house solution, for the purpose of maximizing redemption. Again, in my experience, using an external provider improved redemption by a factor of 3.
But I would look elsewhere for a platform other than TrustPilot.
-
RE: Fetch data for users with ajax but show it without ajax for Google
Cloaking is very dangerous, and the most common reason for Google to use its axe.
If your code has anything resembling "if googlebot then ...", you are at risk.
But in this case you do have a solution which, theoretically, should have no negative effect: Google has been endorsing the technique of serving static content on first load and updating it with ajax.
But let me stress what that means: serve static content on first load, and update it with ajax. Which means no cloaking; don't serve different content (nor different code with identical-looking content) to visitors than to googlebot.
Additionally, it is very important to please visitors by serving content to them fast, but at the same time it's important to serve content fast to googlebot, since speed is a ranking factor.
-
RE: How can I tell Google not to index a portion of a webpage?
Correct, you should make sure it is not used elsewhere.
But I can't refrain from stressing again that hiding the content is unlikely to be the best strategy.
-
RE: How can I tell Google not to index a portion of a webpage?
You can put those chunks of content in an iframe and block it with robots.txt, or just add a noindex meta tag to the iframe source.
But I would not. If you can't build a plan to make the content unique, just canonicalize, or let Google choose which page to pick from the duplicate bunch.
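A minimal sketch of the iframe approach, assuming a hypothetical fragment at /snippets/shared-description.html (note that if you block the fragment with robots.txt, the crawler can never see a noindex tag inside it, so pick one mechanism or the other):

```html
<!-- Parent page: the duplicated chunk is pulled in through an iframe -->
<iframe src="/snippets/shared-description.html" title="Shared description"></iframe>

<!-- /snippets/shared-description.html: ask crawlers not to index the fragment -->
<!DOCTYPE html>
<html>
<head>
  <meta name="robots" content="noindex">
</head>
<body>
  <p>...the duplicated content...</p>
</body>
</html>
```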
-
RE: Do only paid adwords appear in google shopping
Thumbs up for Monica's answer.
I can only add that in my experience Google Shopping campaigns are worth every cent. The CPA is yummy.
-
RE: Cart Abandonment Solutions?
I agree with Hutch that re-targeting/re-marketing does help. We tried AdWords/Facebook/AdRoll/Criteo; depending on your target CPA, some may be out of the question. In my experience the cheapest is AdWords, but it doesn't have the largest reach.
Based on my limited experience, I am also very skeptical about all those packaged cart-abandonment solutions; the real improvement is negligible. In my experience, A/B testing like mad (UX, copy, discounts, exit popups, etc.) is what really brings double-digit improvements.
But I do agree with Ryan that a professional CRO company (as http://www.conversion-rate-experts.com/ seems to be, though I have never tried their services) would probably help identify what to test and shorten the time required to get results.
-
RE: Trailing Slashes and SEO
That's true, and you may be right. But as ignorant as I am, it's the very first time I have read that trailing-slash canonicalization is not best practice.
And I truly mean you are probably right, but I would still like to see some piece of proof, like a Matt Cutts video or an article on Search Engine Journal, before calling what was once considered best practice dead and buried.
-
RE: Trailing Slashes and SEO
Google's official blog thinks otherwise:
http://googlewebmastercentral.blogspot.de/2010/04/to-slash-or-not-to-slash.html
-
RE: Question about understanding Google Ranking System
First, I really think you are underestimating the importance of on-page optimization. If you read that backlinks are all that matters, it's true, but only provided the HTML code is not a mess. And even then, when comparing two websites with a similar backlink profile, the more on-page-optimized one wins.
Second, let's take hotme.ca: for which keywords are you analyzing its ranking?
-
RE: Google Signal for Site Speed: PageSpeed ranking, Time To First Byte, or something else?
In GWT (Crawl -> Crawl Stats) you can find "Time spent downloading a page (in milliseconds)"; that's really TTFB.
I have seen a correlation between an improvement in TTFB and ranking a few times, so I am convinced TTFB matters a lot.
PageSpeed is a different story: it analyzes many more speed factors, mostly client-side. It does analyze TTFB too, called "Server Response Time" in PageSpeed. How those factors impact ranking I don't know; some, like "mobility issues", will officially get you dropped from the mobile index if not fixed.
Some of PageSpeed's suggestions are even controversial and debated in Google's own product forums (like the suggestion to inline styles and JavaScript).
Personally, I try to fix as many PageSpeed suggestions as possible, applying some common sense too, while I always religiously try to lower TTFB as much as possible.
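If you want to sanity-check TTFB outside of GWT, a rough sketch in Python using only the standard library could look like this (it spins up a throwaway local server so the example is self-contained; in practice you would point measure_ttfb at your own host and port):

```python
import http.client
import http.server
import threading
import time

def measure_ttfb(host, port, path="/"):
    """Return seconds between sending the request and receiving
    the first bytes (status line and headers) of the response."""
    conn = http.client.HTTPConnection(host, port, timeout=10)
    start = time.perf_counter()
    conn.request("GET", path)
    resp = conn.getresponse()  # returns as soon as the headers arrive
    ttfb = time.perf_counter() - start
    resp.read()  # drain the body before closing
    conn.close()
    return ttfb

# Throwaway local server on a random free port, just for the demo.
server = http.server.HTTPServer(("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
ttfb = measure_ttfb("127.0.0.1", server.server_address[1])
print(f"TTFB for a local test page: {ttfb * 1000:.2f} ms")
server.shutdown()
```

Repeating the measurement a few times and averaging gives a more stable number, since a single request can be skewed by caches warming up.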
-
RE: Ajax, SEO and Angular
When working with Angular (or any client-side JS framework) and ajax, you have to stick to one simple rule: load the content you want to feed to googlebot on first load, without requiring an ajax round trip.
So in your case, the first 18 room descriptions are content Google's crawler will eat, digest, and index; all the others, loaded dynamically with ajax, won't be.
The same goes for the 400 characters: show them on first load, don't retrieve them later through an ajax call.
Keep in mind that if you want to reveal content gradually, you can serve it all on first load and show it gradually through JavaScript with ng-cloak + ng-show/ng-hide.
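A minimal sketch of that pattern in AngularJS (the markup and scope names are illustrative): the full text is in the HTML served on first load, so the crawler can read it, while ng-show/ng-hide only toggle what the visitor sees and ng-cloak prevents a flash of the hidden part before Angular compiles.

```html
<!-- Both paragraphs are present in the page source on first load (no ajax). -->
<div ng-app ng-init="expanded = false">
  <p ng-cloak ng-hide="expanded">The first 400 characters of the description...</p>
  <p ng-cloak ng-show="expanded">The full description, already in the page source...</p>
  <a href="" ng-click="expanded = true">Read more</a>
</div>
```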
One word about ng-show/ng-hide: there has been some debate (even here in the Moz Q&A) around css display: none;, because John Mueller and Matt Cutts have stated a few times that Google doesn't like hidden content, meaning they do not index it and may penalize websites for hiding content. In my opinion it was clear they were referring to navigation menus and manipulative techniques.
To quote another of Matt Cutts' videos, "Google knows today's web is dynamic and content is shown and hidden by user interaction", so I wouldn't expect Google to penalize a fair use of ng-cloak (display: none;).
But these kinds of things always worry me, so I tested it on a few pages with 200-300 words of content which were already indexed and ranking around top 20/top 10 in the SERPs: I added some ng-cloak content on some, and removed ng-cloak (just showing the content) on other pages. The result was no change at all; they didn't move, not a single position one way or the other.
One final note about gradually loading more content, like the additional rooms after the first 18 in your example. Of course those are not going to be indexed on that page. But here you have to think about your content strategy for that listing page, and about what is important to feed to Google's crawler for indexing. 100 room titles? Do you really expect people to reach that page searching for one of those titles? I don't think so.
-
RE: Trailing Slashes and SEO
BlogEngine is just a .NET application. Use .NET URL rewriting to redirect properly: you just have to edit web.config to have IIS redirect URLs with a trailing slash to URLs without the trailing slash.
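As a sketch, with the IIS URL Rewrite module installed, a rule along these lines in web.config would 301 trailing-slash URLs to the slashless version (test it on staging first; the conditions keep it from touching real directories and files):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Remove trailing slash" stopProcessing="true">
        <match url="(.*)/$" />
        <conditions>
          <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
          <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
        </conditions>
        <action type="Redirect" redirectType="Permanent" url="{R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```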
-
RE: Will linking to very similar (or duplicate content) hurt SEO
Yes, linking from duplicate content means potentially losing SEO value.
Google could detect those articles as duplicates and leave some of them out of the index; as a result, the backlink will bring no value to you.
If Google does not detect those articles as duplicates, you should still ask yourself whether being linked from extremely similar articles is really what you want. The content of the article where the link is placed is a signal Google will use to choose the keywords you will rank for: same article content, same keywords you are feeding Google.
-
RE: SSL, www issue. Should we buy WWW license or just add redirect from www to non-www site?
Regardless of HTTPS, you should redirect.
Choose whether you want to show www in the URL or not, and redirect the other version.
That is, of course, if the content is the same; if you are serving different content, it's another story.
SSL certificates come as standard or wildcard; if you need to secure a bunch of third-level domains, it may be cheaper to get a wildcard.
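What the redirect looks like depends on your server; as one hedged example, on Apache an .htaccess rule picking the non-www version could look like this (example.com is a placeholder for your domain):

```apache
RewriteEngine On
# 301 any www request to the bare domain, preserving the path
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```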
-
RE: 1,023 blocked malicious login attempts. Who trying to steal my blog? Any advises?
Honestly, I would strongly suggest avoiding blocking traffic on a geographic basis; these days you never know where traffic will come from or why.
A user sitting in the building next to yours but accessing the internet through a corporate network may appear to be connecting from China.
A legit bot from a service you are paying for may appear to be crawling from Sweden, and other legit bots you don't even know about, but which let you reach an additional audience, may appear to be connecting from the other side of the world.
Blocking traffic is positively dangerous; the only case where I would consider it a good decision is blocking blacklisted IPs, and even then I would suggest making sure the blacklist is updated regularly to avoid blocking false positives.
-
RE: Question about understanding Google Ranking System
That megamenu is giving that page too many external links.
Choose a main keyword for each page and optimize for it; choose a few (semantically related) secondary keywords and try to optimize the page for them without jeopardizing the first.
Please go through the Moz guides, blog posts, and the very nice Whiteboard Friday videos again to better understand what "optimize" means.
Once done, if you look back at the source code of those pages, you will notice yourself that they don't smell nice.
Depending on the keyword competition level, you may need to get some backlinks for those keywords, which means generating good-quality links from relevant websites, within relevant content, without spammy anchors, etc.
Work page by page, keyword by keyword. It's a long and draining process.