Came across this last night too. http://searchengineland.com/from-microdata-schema-to-rich-snippets-markup-for-the-advanced-seo-162902
It talks about microdata schema and rich snippets.
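To make the idea concrete, here's a minimal sketch of what schema.org microdata looks like on a product page, the kind of markup that can trigger star-rating rich snippets. The product name, rating values, and price are made up for illustration:

```html
<!-- Hypothetical product snippet using schema.org microdata.
     All names and values are placeholders for illustration. -->
<div itemscope itemtype="http://schema.org/Product">
  <h1 itemprop="name">Acme Widget</h1>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5
    based on <span itemprop="reviewCount">89</span> reviews
  </div>
  <span itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    $<span itemprop="price">19.99</span>
  </span>
</div>
```

Run anything like this through Google's structured data testing tool before relying on it; a typo in an itemtype URL silently kills the snippet.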
Search Engine Land did a write up a few years ago about a 30% increase in CTR for results with structured data. http://searchengineland.com/how-to-get-a-30-increase-in-ctr-with-structured-markup-105830
Here's an actual Case Study by Jason Jersey of SEOVoom on Structured data on CTR and Rankings in general http://seovoom.com/central/structured-data/
Matt Cutts covered your question in Webmaster Help http://youtu.be/OolDzztYwtQ
My personal thoughts:
I think anything you can do to help the search engines understand what your site is about and how it is structured will ultimately lead to more traffic. Matt's explanation is pretty good: structured data may help you show up in certain places you normally wouldn't have shown up because you didn't have it before. NewsArticle schema is a good example of that. But don't let these sway you either way. I use it as much as possible. I also recommend using the Data Highlighter in Google WMT if you haven't done so. It helps Google even more.
There was a post by Barry Schwartz over at Search Engine Roundtable about Google reducing Rich Snippets by 15% or so. Matt Cutts basically said that it would remove snippets for low quality sites. I personally think they are getting ready to ramp up testing for AgentRank and giving snippets to "authority" authors (just a hunch/guess as I predicted in 2011, after their patent update, that AgentRank (aka AuthorRank) would roll out in 2015. http://www.vzpro.com/2012-seo-prediction/)
Don't think of the word count as a metric. Think about your users (I know that's becoming cliche now). Does a 200-word post actually help them with something, or is it at least useful? I don't know of many that are. Sometimes that sort of short post may be better off on Google+, FB or Twitter.
Neil Patel wrote a really good article about a year ago making the case that the more words, the merrier. Posts over 2,000 words seemed to rank better. http://www.quicksprout.com/2012/12/20/the-science-behind-long-copy-how-more-content-increases-rankings-and-conversions/
Also John Doherty wrote on the MozBlog about the types of posts that get links and he did some analysis to back it up.
Simple answer is Both.
Having variations of your keyword on a given page will help with rankings for both keywords. I assume (we all know what happens when we do this) Google will consider these synonyms, and after I ran the following search strings, www.google.com/search?q=e-commerce&pws=0 and then google.com/search?q=ecommerce&pws=0, I think my assumptions are correct. There is a mix of e-commerce and ecommerce in both SERPs. That being said, I would have both variations on my page. It should naturally be linked to in both ways, and with co-citations working in SERPs, it won't really matter. If you just "built" links to your page with one version, that won't look very natural. I personally would add e-commerce a few times and ecommerce a few times (if appropriate).
On a personal level, I think of e-commerce not ecommerce.
Google on Synonyms: http://googleblog.blogspot.com/2010/01/helping-computers-understand-language.html
Matt Cutts on Variations of Words: https://www.youtube.com/watch?v=NpnnXt7CHMU
I've noticed that separate sitemaps (via a sitemap index file) help my sites get indexed much faster. A single sitemap hasn't been as successful at getting my sites indexed as using one for video, one for categories, one for images, one for news, one for pages, one for mobile versions of your site, etc.
You can set each of their hierarchies separately and it works best for me. I also like to use Google+ for the sites. I've seen some high correlation between indexation and use of Google+.
Google info on using Sitemap Index Files https://support.google.com/webmasters/answer/71453
Here is the actual protocol from Sitemaps.org http://www.sitemaps.org/protocol.html
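The index file itself is tiny: it's just XML that points at each of the individual sitemaps, following the sitemaps.org protocol linked above. A minimal sketch (the example.com domain and filenames are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap index; domain and filenames are placeholders. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-pages.xml</loc>
    <lastmod>2013-12-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-images.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-video.xml</loc>
  </sitemap>
</sitemapindex>
```

You can submit just the index file in WMT and Google will discover the child sitemaps from it.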
I won't be doing it. The search engines have changed since the early 2000s. Do you have a .us or .info to protect your brand? To match my domain authority and brand awareness in Google, they would have to do tons and tons of work. "Protecting your brand" implies to me that someone will be pretending to be you and selling what you offer. They'd have to outrank me for my keywords and brand awareness. I don't see that happening. EMDs could still be worth it, but people naturally search for things vs. typing in the exact domain name with the correct extension. (I know Google had an EMD update, but that mostly affected thin-content EMDs.) If you are truly a "brand" with recognition, people know what outlets you have. If someone is pretending to be BobGW.blog, I find it hard to believe they would be overtaking your efforts. I just don't see it working out like it did early on in the web. Brands matter now, and microsites (though they still work) are going by the wayside.
Is there a "none of the above" option? .com is still king and always will be. Until a major brand uses one of them as their MAIN url, it won't happen. If Overstock.com couldn't make O.co stick, then I don't see ANY of these working very well. I think a brand may use them for a specific purpose, but it won't be anything significant. If I had to guess, I think .shop would work. I could see nfl.shop or something like that catching on.
There are mixed opinions on how Google uses them. I happen to think the weight is zero for all of them except H1, and I'm not convinced even of that. There is a case for H1 tags with keywords in them, but I've ranked pages without strong H1 tags too.
It will depend on the design of your site, but typically header tags are bold and slightly larger than the standard text on your page. You can check your website's files and change them around if you want to change the sizes. Think about reading a textbook. It's nice to have the chapter titles one size, the sections another size and the subsections another size. It's for your users. You want to make sure they can easily find what they want on your page. When we code sites, we generally indent too. That's not for Google, it's for us to find things faster. Same concept.
Your first thought is correct. You should use them to show hierarchy of content.
Directly from Google's Search Engine Optimization Starter Guide (page 20)
"Heading tags (not to be confused with the <head> HTML tag or HTTP headers) are used to present structure on the page to users. There are six sizes of heading tags, beginning with <h1>, the most important, and ending with <h6>, the least important. Since heading tags typically make text contained in them larger than normal text on the page, this is a visual cue to users that this text is important and could help them understand something about the type of content underneath the heading text. Multiple heading sizes used in order create a hierarchical structure for your content, making it easier for users to navigate through your document."
There isn't a limit on any of the header tags as far as I know. You are correct that most of us suggest using only one H1 tag. I generally don't put a limit on the number of H2, H3, etc. tags I use on a page. Remember, you are building your pages for your users and not for Google. Use your header tags to help people navigate the information on your page.
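To make the textbook analogy concrete, here's a hypothetical page outline (the headings are placeholders, and the indentation is just to show the hierarchy):

```html
<!-- Hypothetical outline: one H1, then H2s for sections and
     H3s for subsections, like chapters/sections in a textbook. -->
<h1>Guide to Marble Restoration</h1>
  <h2>Cleaning Marble</h2>
    <h3>Daily maintenance</h3>
    <h3>Removing stains</h3>
  <h2>Polishing and Sealing</h2>
    <h3>Tools you'll need</h3>
```

Skipping levels (an H1 straight to an H4) won't get you penalized, but the in-order structure above is what the Starter Guide is describing.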
There was a great Whiteboard Friday by John Doherty (formerly of Distilled) on smarter internal linking
http://moz.com/blog/smarter-internal-linking-whiteboard-friday
and here is the follow up post
http://moz.com/blog/internal-linking-strategies-for-2012-and-beyond
Also, Ken Lyons wrote an awesome post over at Search Engine Watch back in mid 2012 on internal link structure too.
Sitewide links are okay, but they need to be relevant for EVERY page they are on. Don't let the quantity of a link profile concern you. I have a client with fewer than 100 links to his site but a Domain Authority of 41, because they are from really high-authority sites. Quality of links matters, and especially RELEVANCY. I recommend keeping sitewide links to a minimum. You can overdo them.
I have an old client that used Tweet Adder to do this. It would auto-follow and then, after a set number of days, unfollow people who didn't follow back. I don't know if they still use it, but I do know that tools like this can easily get your account suspended. I don't recommend them. I've also heard of ManageFlitter.
I think it's a great way to develop future content. If you have good analytics tracking on your Q&A (or FAQ section for some sites), you can take the questions that get a ton of traffic and make posts out of them for your main site. Also, if you have a particular section/topic that gets a ton of questions (i.e. Panda, Penguin, Hummingbird, etc.), you could develop a webinar for the people that come to your site.
As you develop your site and the Q&A section becomes widely used, you'll start to see questions show in Google searches. In general, as long as you develop that section for your USERS and not for Google specifically, you'll get the traffic you want and more sales (assuming that's the goal).
Wanted to add that there will be a Mozinar on December 17th called "Content Marketing in Boring Industries" by Ross Hudgens. I would definitely take the time to watch it. Here is the link to sign up: https://www3.gotomeeting.com/register/474514734
The REASON you got indexed is important. If it's because of linking then I would totally start over. The only trump card to that would be if you have strong brand signals. YOU CAN RECOVER but you'll have to determine if your metrics are strong or weak. If you had very strong metrics behind your site, then you can recover within a year and be back to where you were. If your signals are weak, then the effort to correct your issues will be the same time/money it would take to start over and build up domain metrics. Your Brand will matter. If you have good brand mentions then maybe staying on that domain is the best option.
Moz is great because it provides metrics and information about your site. SEO is still an art and a science. The tools that Moz has won't rank your site. They will provide invaluable information about your site, but you'll still have to choose the best path for your marketing strategy. For example, if Moz tells you that you have 25 pages of duplicate content, that doesn't do anything for you unless you know how to act on that information. If you haven't done so already, I highly recommend reading over Moz's beginner's guide to SEO. I'm not saying you need to learn SEO from scratch (you probably already know a lot), but the guide has information that may help you think about how to use the tools that Moz offers. (http://moz.com/beginners-guide-to-seo)
I run a few campaigns in Moz and each of them is set up completely differently. Here is a help guide for each of the tools on Moz and how they can be used to extract data: http://moz.com/help/guides/research-tools. If you look at the left sidebar, you'll see links to the specific tools. On each of those tabs there are videos, a how-to section and FAQs. By going through each tool and seeing what it does, you'll be able to see how to effectively use the tools to generate the type of data you need to develop an awesome web strategy.
Currently Google Authorship only supports one author per post as far as SERPs go (http://googlewebmastercentral.blogspot.com/2013/08/relauthor-frequently-asked-advanced.html) but I would recommend adding bylines for all authors on a post so if/when Google allows for multiple authors you have everyone show up there.
Technically no. Authorship is meant for individual people, whereas Publisher is meant for brands. However, Publisher may show up with your Google+ logo if a person who follows you is signed into their Google Plus profile. You should set up rel=publisher to identify cardekho as the publisher of the site, and whoever is the author should have a byline. If there are multiple authors, then each author should have a byline. Google will only support one image in the SERPs, however, and it will generally be an author.
Again, this should be a rel=publisher schema markup issue. Having authorship show up in search results does tend to lead to higher click-through rates, but as a pure ranking factor related to Authorship, I think the jury is still out. I still think domain-level metrics are more important than author stats.
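As a sketch, the two pieces of markup look like this (the Google+ profile URLs and the author name are placeholders):

```html
<!-- In the <head>: rel=publisher ties the whole site to the
     brand's Google+ page. URL is a placeholder. -->
<link rel="publisher" href="https://plus.google.com/+ExampleBrand"/>

<!-- In each post's byline: rel=author ties that post to the
     individual writer's Google+ profile. -->
<a rel="author" href="https://plus.google.com/112345678901234567890">
  Jane Doe
</a>
```

One publisher link sitewide, one author link per post, and verify both with the structured data testing tool.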
Miriam, you are a rockstar! I just want to add my two cents.
Most citation sources find your listing by phone number, so putting more than one phone number per location will generally get the listings rejected or merged by those sources. I strongly agree that each of your locations should have a separate number. Your NAP (name, address, phone number) consistency is also a ranking factor. (Here is Miriam's post about local SEO ranking factors: http://moz.com/blog/top-20-local-search-ranking-factors-an-illustrated-guide)
If you are building citations for each of your doctors, I recommend separate numbers for them too (but it's not required). I use a company called ifbyphone.com. They have a basic service plan at $49 per month plus $2/month per phone number plus minutes. You can have them all forward to one number if you want, but it's a good way to get around the issue. That's about $200/month plus minutes, but you can use these numbers for multiple things like marketing too (ie AdWords, billboards, radio commercials, etc.)
That being said, Google Maps said
The reason I recommend different numbers is for third-party citation sources. If you can justify the $200-per-month expense, I would highly recommend using separate numbers for each doctor. You'll be able to build strong rankings that way. I always worry about Google changing its policy in this area, so I think separate numbers are the better idea.
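If you want search engines to read the NAP on each doctor's page unambiguously, you can also mark it up with schema.org microdata. A minimal sketch (the practice name, address, and number are all placeholders; keep whatever you publish here identical to your citations):

```html
<!-- Hypothetical location block; keep this NAP character-for-character
     identical to the NAP used in your citations. -->
<div itemscope itemtype="http://schema.org/Physician">
  <span itemprop="name">Example Family Practice</span>
  <div itemprop="address" itemscope
       itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main St</span>,
    <span itemprop="addressLocality">Springfield</span>,
    <span itemprop="addressRegion">IL</span>
    <span itemprop="postalCode">62701</span>
  </div>
  <span itemprop="telephone">(555) 555-0101</span>
</div>
```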
Here are some guidelines from Google Webmaster Help on duplicate content, with tips to resolve issues.
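The fix those guidelines usually point you to for duplicate URLs you can't 301 is the canonical tag, which goes in the <head> of each duplicate and points at the version you want indexed (the URL below is a placeholder):

```html
<!-- Placed in the <head> of the duplicate page; URL is a placeholder. -->
<link rel="canonical" href="http://www.example.com/preferred-page/"/>
```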
Jeff's answer is very good.
I just wanted to add that if your sites link to one another, I would highly recommend having separate IPs and even separate hosting accounts if possible. Part of Google's patent on rankings covers referring domains and the IPs those domains are on. I've personally seen moving a domain I own to a different host with a different IP dramatically increase each of the sites' rankings. (This may be just correlation and not necessarily causation, but it's worked for other people I've worked with too.)
First of all, be careful about what you outsource. I think the last thing you should be outsourcing is your content. It's okay to get content ideas from other places, but outsourcing your content is not the right strategy in my opinion. Secondly, one or two blog posts a month would be enough to help your rankings. Google likes freshness, and if you provide it, they will generally reward you.
Think about your users when it comes to relevance. Your blog posts should have something to do with your industry. If your site is about marble restoration and you start talking about cars, then you might have an issue. I read blogs that interest me, and I go back to blogs that regularly talk about topics I'm interested in. Being all over the place isn't the best strategy for repeat readers. What that means is that you could write about the history of marble, marble in famous buildings, cleaning and maintenance of marble, marble vs. granite, why marble comes in different colors, etc.
Here is a little trick I use to get content for my client sites. I go to iwriter.com and for $3 to $5 I can get a 300 word article on a particular topic. I use the article as a basis for the article I end up writing. You can also use oDesk and for around $25 to $50 you can have people do internet research on everything you ever wanted to know about marble or historic buildings that have marble in them.
You own the business, and at the end of the day, no one will know more about your clients (and potential clients) than you. You have to write for the people who will want your services, not for the search engines.