Could you elaborate a bit more on "it's a tricky line to walk"?
This kind of deep linking is something I have still not understood, and your answer intrigued me. Can you explain better?
If you want the best tool for link discovery or link checking, use https://ahrefs.com/
I use OSE, I use Semrush, I tried MajesticSEO; in my opinion nothing comes near Ahrefs when it comes to discovering links.
I am a happy user of OSE, and I think the Moz algorithms calculating DA, PA, MozRank, etc. do excellent work; OSE is always my first stop when I want to evaluate and compare websites or pages.
But no other tool I tried is close to Ahrefs for link discovery.
Of course you are already using GWT, right?
Which title tags? Product pages? Brand pages? Categories? All of them?
And are your doubts about what to put there related to SERP CTR? Ranking?
I think you should keep in mind that the <title> tag is very much related to CTR, because that's what people will see in a prominent position in the SERP. So I think you should craft titles to be as related as possible to the search query you are targeting, and of course make them as catchy as possible. Unfortunately I am not aware of any solution to A/B test titles.
About SEO, as far as I know the title tag is one of the strongest assets when it comes to on-page optimization.
It's a few months old but still very much relevant: http://moz.com/blog/new-title-tag-guidelines-preview-tool
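For instance, a hypothetical brand-page title crafted along those lines (store name and wording are invented, purely illustrative):

```html
<!-- Invented example: front-loads the likely query, adds a catchy hook,
     and stays short enough not to be truncated in the SERP. -->
<title>Michelin Tires: Prices, Reviews &amp; Deals | Example Store</title>
```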
I am very curious to know what you will find out, Jeff. Let me know when you have the results.
If I may ask, when you said your recent campaign was very successful, by which metrics? Can you share some details?
I am really hungry for FB campaign data; I wish people shared info on FB campaigns more.
So far the likes are still there.
If it was some sort of ad testing, why like the URL more than once, or twice? If it's just a test, why 60 times?
Furthermore, with your question you are opening a door into another mystery. In the past weeks I tested Likes campaigns for fan pages, and what I noticed is that once I stop the campaign, not only does the rate of new Likes per day decrease, but the balance between Likes and Unlikes in the following days and weeks stays flat; in other words, the total number of Likes stays the same. Which sounds odd for a few reasons: 1) who goes back to unlike something? I actually never did; 2) how could Likes and Unlikes balance each other regularly, week after week?
It sounds suspicious.
Especially when you consider FB removed the possibility to list the users who Liked something from the Graph API and FQL.
All the people reporting FB Likes fraud in the past were looking at the profiles of users Liking their page or URL and finding strange things. And FB removed that.
I keep thinking it smells bad.
As far as I know, Google reads the meta description and analyzes it; if it doesn't seem to fit the page content (according, of course, to Google's algorithm), instead of the meta description Google shows some paragraph from the page close to the keywords being queried.
But, actually... Looking at the page you mentioned, there's no meta description in the page source.
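For completeness, the tag Google would look for is simply this (the content below is a made-up placeholder):

```html
<!-- Placeholder content; as described above, Google may still replace it
     with on-page text if it doesn't match the query. -->
<meta name="description" content="A short, query-relevant summary of the page.">
```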
I did some testing with Facebook ads and the resulting Facebook Likes on target pages. And something is wrong!
Just after creating the ad, before the ad was approved, I could get many likes on the target page without even seeing any traffic on server logs or google analytics.
I documented my tests here. I repeated the tests 6 times, every time I could get tens of likes on the target page within seconds and without any approval of the ad.
These pages have little traffic and there's no way these likes could have been natural!
Everyone I reported it to was able to reproduce it; try it yourself.
A little update. I found others reporting unexpected Likes pouring in...
But at least they were "running" the ad, so people external to Facebook could have seen it and decided to Like it in order to later offer a "liking" service, as reported by this fellow.
But in my case the ad was not approved... So... Who can see it besides me and Facebook?
I am asking because I keep filing spam reports in Google Webmaster Tools when I find them on competitors' websites, but nothing happens...
Anyone else had better experience with spam reports?
That's also my conclusion. That's an e-commerce site for car tires, which is not a sexy product people enjoy talking about on FB.
We tried Custom Audiences as well; I should edit the question to stress that it's included in the remarketing we did. Thanks for the links, I am going to study them now.
So, social these days is hot. But is there anyone who can report some figures about a real, analytically documented success in using social for e-commerce?
There are a lot of presentations like this one saying it's great, it has a big impact, etc...
But how? And what about figures for sales, or traffic, or something you can measure... Which had a significant spike thanks to a social network effort?
I am asking because I tried to use Facebook for e-commerce myself, and we tried a lot:
After 6 months and around 15k euros spent, the result is:
My conclusion is either we are really dumb at using social for e-commerce, or we wasted our time...
Anyone had some experience to share?
Ray, my understanding of this is that, one way or another, you have to redirect every single URL. You can do it with URL rewriting (either .htaccess on Apache or web.config on IIS) or some other technique, but just redirecting the domain won't work, unless the site URL structure stays exactly the same.
Just to clarify any confusion: if an existing URL is http://old-domain/foo and the new projected URL is http://new-domain/bar, just redirecting the domain won't work (meaning Google will not transfer any juice).
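A sketch of what that per-URL mapping can look like in .htaccess on the old domain (the /foo → /bar paths are the placeholders from the example above, not real URLs):

```apache
# Hypothetical .htaccess on http://old-domain/ using mod_rewrite.
RewriteEngine On

# One rule per URL whose path changes on the new domain:
RewriteRule ^foo$ http://new-domain/bar [R=301,L]

# A blanket rule like this one only works when the URL structure is identical:
# RewriteRule ^(.*)$ http://new-domain/$1 [R=301,L]
```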
We have been using Angular for a while, and we love it. It's a fantastic framework.
Of course the Google crawler is not going to interpret such complex JavaScript, so when you use Angular you must think about SEO.
One approach, beautifully described by Google here, is to use the "escaped fragment" query string to serve static versions of the pages to the Google crawler. We used it in the past; in my opinion it is complex, and it's not the way to go.
Another approach is to use server-side code to render the page content and Angular to make it dynamic. For every page where you employ Angular, think about how you can serve the content at first load without Angular, and then add Angular to make the UX dynamic.
For example... If you have a slider/carousel, you may be tempted to load the content with Angular using ajax. Don't: load it in the body of the page and render it as a slider using Angular. You can use ng-cloak (which is perfectly white-hat as long as you use it properly) together with ng-show and ng-hide, or ng-animate.
If you plan on using Angular URL routing, that's fine as well; just don't load the main content with ajax. Put it in the page in a way that is safe for crawlers and use Angular routing to choose which one to display.
You can load secondary content like menu options or form field options with ajax, as long as those are not essential for your on-page optimization.
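A minimal sketch of that slider pattern (assumed AngularJS 1.x markup; the app and controller names are made up): the slides are real HTML at first load, so crawlers see the content, and Angular only adds the behavior:

```html
<!-- Content is rendered server-side in the page body; ng-show only
     toggles visibility, so crawlers still find every slide's text. -->
<div ng-app="shopApp" ng-controller="SliderCtrl" ng-cloak>
  <div class="slide" ng-show="current === 0">First slide content...</div>
  <div class="slide" ng-show="current === 1">Second slide content...</div>
  <button ng-click="current = (current + 1) % 2">Next</button>
</div>
<script>
  // Hypothetical controller: only behavior lives here, never the content.
  angular.module('shopApp', [])
    .controller('SliderCtrl', function ($scope) { $scope.current = 0; });
</script>
```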
Angular is a great tool and does speed up development a lot; I am a big fan of AngularJS.
The answer is simple: the markup is wrong, exactly for the reasons signaled by the Google testing tool.
Take your time to read the properties allowed for each of the schemas you are using, and fix them...
What was your Moz page grade in the past?
And what about fixing the issues flagged by the Moz page grader as a first step?
These can be helpful:
http://backlinko.com/on-page-seo
http://moz.com/blog/visual-guide-to-keyword-targeting-onpage-optimization
My website is gomme-auto.it, the one with the rich snippets. The competitor doesn't have them.
Thanks for your answer.
What is frustrating, and the reason why I am comparing our pages with competitors', is that we already did that: we added user reviews to the page, we made the content richer by improving the writing, and we added some external links where appropriate. But that didn't change the position of the page in the SERP. That's why I am trying to figure out what we are missing, and what Google loves so much about those competitors' pages.
Well, I have been scratching my head over this for days; I will throw the ball to you in the hope that someone more experienced than me can help.
The scenario: e-commerce -> brand page -> SERP -> comparison of how two pages rank, one from my website and one from a competitor's website.
The brand is Michelin; the keyword is "pneumatici michelin" (the Italian equivalent of "michelin tires").
I am not looking at the SERP's first page, where competition is surely much fiercer. I am looking at position 11: http://www.cambio-gomme.it/marchi/michelin/
And my page (not in the first 50): http://www.gomme-auto.it/pneumatici/michelin
My page:
MOZ Page Grade (for keyword “pneumatici michelin”): A
External backlinks to the page: 1
Domain Authority: 29
Page Authority: 24
On-page SEO optimization:
keyword density: 0.87%
internal links: 145
external links: 3
page size: 108kb
html size: 24kb
words on page: 2077
link-words: 408
non-linked words: 1669
time to first byte: 0.419s
Competitor page:
MOZ Page Grade (for keyword “pneumatici michelin”): A
External backlinks to the page: 0
Domain Authority: 26
Page Authority: 13
On-page SEO optimization:
keyword density: 0.75%
internal links: 70
external links: 1
page size: 31kb
html size: 9kb
words on page: 1521
link-words: 168
non-linked words: 1353
time to first byte: 0.373s
Domain age is very similar, both websites launched close to each other in 2012.
Ideas? Suggestions on other metrics to compare?
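For what it's worth, the keyword-density figures above follow the usual rough formula: words belonging to occurrences of the phrase, divided by total words on the page. A sketch of that computation (my own helper, not any tool's exact algorithm):

```javascript
// Rough keyword-density sketch: (phrase occurrences * words in phrase)
// divided by total words, as a percentage. Not any tool's exact formula.
function keywordDensity(text, phrase) {
  const words = text.toLowerCase().split(/\s+/).filter(Boolean);
  const target = phrase.toLowerCase().split(/\s+/).filter(Boolean);
  let hits = 0;
  // Slide a window of the phrase's length over the page's words.
  for (let i = 0; i + target.length <= words.length; i++) {
    if (target.every((w, j) => words[i + j] === w)) hits++;
  }
  return (100 * hits * target.length) / words.length;
}
```

By this reckoning, 0.87% of 2,077 words is roughly 18 words belonging to the phrase, i.e. about 9 occurrences of the two-word keyword.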
I agree it's dull, and I hate it myself. But it surprisingly works.
As far as I can tell from my experience:
It can get you to some decent results.
So far I have not been able to find other ways of doing it.
We tried building some great, unique, quality content: complex, expensive infographics with original analysis. We built some backlinks to that content, did the on-page SEO homework, got to position 1 in the SERP, and emailed the exciting news to hundreds of journalists and news agencies, with zero (repeat: zero) natural backlinks.
Meanwhile, the dull, boring, and time-expensive mailing to bloggers usually results in 2 out of 10 giving free guest posts, 3 out of 10 requesting money to publish the content, and 5 out of 10 ignoring us.