Best posts made by DougRoberts
-
RE: Our root domain is no longer appearing in search results
Maybe I'm misunderstanding the question, but I notice that the page at https://roadtrippers.com/ has the following meta tag in the header:
-
RE: What is an Achievable SEO Growth Rate?
It's really difficult to provide an answer here as there are just so many variables, and I think it's important to be realistic when talking to the client. SEO doesn't work like a tap that you can just turn on to get more visitors.
As Alice mentions, for a new site I've seen the same thing: slow initial growth until you hit a tipping point and the site achieves some rankings that have reasonable traffic. At this point there can be a surge of traffic followed by (hopefully) steady growth at a slower rate afterwards (but you've got to keep building the content etc.) to avoid hitting a plateau.
It does of course depend on factors such as visitor demographics, size of the market niche you're operating in, competition etc.
I've worked on one site where the entire market for the service being offered is less than 1000 people in the UK. (Assuming we're working to get qualified traffic, you can see that we're never going to get high month-on-month growth.)
Also watch out for other factors such as seasonality or weather. (I've had one site have record visits due to the wet summer in the UK. Now there's the first glimmer of sunshine, I've got a traffic decline on my hands.) Is the number of people looking for jobs affected by external factors such as the economy and the time of year? When are companies likely to recruit? How do holidays, the annual budget cycle etc. play a part?
The key to making any traffic prediction is to understand the target audience: knowing when you need to broaden your appeal or come up with other services/products in order to bring in new visitors and keep the traffic (and the business) growing.
Taking your job portal as an example - how many people are looking for a job or looking to recruit and could use the services it provides? What competition is there? How much of the total market are you likely to appeal to?
Hope this helps!
-
RE: How is the keyword difficulty score calculated in SEOMoz?
Ben, it's search competition based on the Domain Authority and Page Authority for the top 20 search results on Google. The score doesn't come from AdWords; that's used for the search volumes.
The links on this page explain how Domain and Page Authority are defined:
http://www.seomoz.org/learn-seo
Joel's link provides a full run-down of the keyword analysis tool.
-
RE: Australian local business website on a dot.com - how do I ensure it's indexed/ranked by Google.com/au as priority
I've got a client in a similar situation. They have a .com domain but they only offer services in Australia.
The good news is they rank just fine for the search terms that are relevant to their business (these are primarily queries with local intent).
In Webmaster Tools, you'll want to tell Google that your site is targeted to an AU audience. There's an option to set this under "Search Traffic" -> "International Targeting". Click the "Country" tab and then you can choose to target users in a particular country.
We also made sure that it was very clear from the content on the site that the business was based in Australia, paying particular attention to local search optimisation (see: http://moz.com/learn/local ) - getting local business listings in reputable, authoritative directories etc.
It's best to think of it from a visitor's point of view - what are the on-page signals you need to provide to give them confidence that they are dealing with an Australian site? (Local address, Australian phone number, mentions of relevant places / people / events, associations/badges/certifications from relevant Australian bodies etc...)
Hope this helps - Good Luck!
Doug.
-
RE: Rankings drop
First of all, don't make any rash assumptions. Rankings can fluctuate for a while, especially following such updates. You don't mention how far you've fallen, but I'm currently seeing you ranking at #9 for Turkey Recipes, which given the competition is pretty good! I wouldn't panic just yet.
That's not to say that there aren't things you can do to improve your position.
For a start, you're up against lots of very authoritative domains and big brands such as Jamie Oliver, BBC Good Food, Betty Crocker (US) and Channel 4. So not only are you having to battle for rankings (especially for the more generic keywords) but getting the click-through from the SERPs is going to be a challenge as people tend to be drawn to the names/brands they trust.
As a result, you'll want to make sure that the individual recipe pages are pulling their weight. Lots of recipes each getting a few visits is going to be better than a generic keyword sending you very little. Look at which recipes are attracting traffic and which aren't and adjust your content appropriately.
There are lots of different factors such as freshness, seasonality etc. Recipes with pumpkin go nuts around Halloween! So while I get that turkey is the focus - don't forget your other ingredients!
You could definitely improve the on-page optimisation and increase the amount of descriptive text about the recipes. Just the method and the list of ingredients isn't going to help you capture long-tail keywords.
One recipe page I looked at consisted of just 165 words. (The html was ~150k!)
Schema markup for recipes can work well. Google has specific search options for recipes enabling you to filter by ingredients, cooking time and calories.
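If it helps, here's a minimal sketch of what that markup could look like, generated as JSON-LD with a few lines of Python. The recipe name, times and nutrition figures are placeholders, not taken from your site; check Google's recipe structured data documentation for the exact properties they surface.

```python
import json

# A minimal schema.org Recipe object - every value here is a placeholder.
recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Classic Roast Turkey",
    "description": "A simple roast turkey with herb butter.",
    "recipeIngredient": ["1 whole turkey", "100g butter", "a handful of fresh thyme"],
    "recipeInstructions": "Rub the turkey with the herb butter and roast until cooked through.",
    "prepTime": "PT30M",   # ISO 8601 duration: 30 minutes
    "cookTime": "PT3H",    # 3 hours
    "nutrition": {"@type": "NutritionInformation", "calories": "650 calories"},
}

# Emit the <script> block to drop into the recipe page's HTML.
print('<script type="application/ld+json">')
print(json.dumps(recipe, indent=2))
print("</script>")
```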
Look at how you can encourage ratings and comments. Lots of your competitors' recipes have these!
From a user experience point of view it may also be worth looking at implementing a faceted search to enable people to search the recipes on your site by ingredients etc. rather than just the pretty bland top-level categories.
I'd also consider a responsive design (people are more likely to have a smartphone/tablet in the kitchen than a laptop or desktop). It would be worth looking at your traffic to see how much of it is coming from mobile.
Hope this helps.
-
RE: White Papers! Is this still good for SEO
From a "pure SEO" point of view, whitepapers (like any rich content) can really help get some good long-tale content on the site, and may earn some links if they're good.
My personal view is that whitepaper are more useful when it comes to conversion.
Whitepapers can be used to support your offering/your proposition by providing evidence that you've understood the prospects needs, delivered a great solution to ther problem that delivers real benefits to the customer. (and explain the unique benefits of your solution that they wouldn't have received from your competitors)
Such whitepapers, like other good content may attract links too (unless they're hidden behind a subscription form)
Well presented solutions to problems that your target audience experience are of interest to people who, as Casey mentioned, may be willing to provide their email address, join a mailing list in order to receive the whitepapers. Think about the value of building such a mailing list for the business.
I'd take a step back and think about what you're trying to acheive and how you can use white papers to support these goals.
Hope this helps.
-
RE: What's Your Thought About Market Samurai as An SEO Tool Compared to SEOMoz?
Market Samurai's keyword research is just a front-end to the Google keyword tool. I must admit I do like the ability in MS to keep the keyword research together, but I find myself more often than not using Google's tool directly.
I don't find MS's way of assessing the competitiveness of keywords too useful. It's all very nice knowing how many pages are being displayed in the SERPS or have keywords in their title tag, but this doesn't tell you how easy it's going to be to get your page into a competitive position.
This is where I find SEOmoz's keyword difficulty tool incredibly valuable.
I prefer having SEOmoz collect my ranking/traffic data automatically over having to manually update MS.
SEOmoz gives you a lot - on-page analysis, social reports, link analysis etc. - but it really does depend on what you need and what your budget is.
My advice would be to try both and see which one helps you achieve your goals as quickly and easily as possible.
Based on the benefits I have directly experienced, and the added value I've been able to provide my customers, I would choose SEOMoz.
Hope this helps.
-
RE: E-Commerce SEO: Where to start with 4,000+ products?
There's no absolutely perfect way to prioritise the order to review/update the existing product pages. You've just got to determine which products matter the most to you.
As we've mentioned you want to think about prioritising the products with the most sales, products with the highest margin, new products, hot products (ones that have a social buzz about them!) etc.
You may want to rule out products that are near end of life and will be removed from your catalogue soon or products that have low sales and low margin (think about the ROI of your time/cost)
As for how many you do - you'll need to go with what you're comfortable with. The problem with trying to do too many is that it becomes increasingly hard to create great content the more you do. You're better off writing a few great ones than forcing yourself to churn through loads. You want to sound unique/fresh/natural. Hard to do when you're tired!
When it comes to writing the descriptions, comments/feedback is a good place to find out what really matters to people, the concerns people have, the suitability for a particular audience etc. (Tip: You don't have to look at the comments on your own site.)
If you've got a great social following, then you may be able to think of an interesting/creative way to utilise that to help you out.
Of course, before you start working your way through the list, make sure you create a checklist/process for adding new products so that the problem doesn't get any worse while you're working your way through!
-
RE: Is it possible that Google may have erroneous indexing dates?
Hiya Sorina,
When you use the custom date range, Google isn't listing results based on the date they were indexed. Google is using an estimated publication date.
Google tries to estimate the publication date based on meta-data and other features of the page such as dates in the content, title and URL. The date Google first indexed the page is just one of the things that Google can use to estimate the publication date.
I also suspect that dates in any sitemap.xml files will also be taken into consideration.
But, given that even Google can't guarantee that it'll crawl and index articles on the day they've been published, the crawl date may not be an accurate estimate.
Also, if the scraped content is being re-published with intact internal links (are these the full URL - do they resolve to your original website?) then it's pretty obvious where the content came from.
Hope this helps answer your question.
-
RE: Hover texts for hyperlinks
If the user needs to read the title text to understand the link then you've probably got your anchor text and context wrong. Using them just to get a few additional keywords in your copy isn't going to help and might actually act as a distraction/source of confusion for users...
... and touch screen device users find it hard to hover.
-
RE: Why doesn't Open Site Explorer recognize the link from my twitter profile?
I just assumed that my twitter profile page hasn't been crawled (yet). I know OSE is showing twitter links to other sites. (I notice that SEOmoz.org has a bunch of links from twitter.)
Do you link to your twitter profile page from anywhere? Are those linking pages in the Linkscape index?
I'm not overly worried about this as links from twitter are "nofollow".
-
RE: Extrapolating Google volumes from the Bing volumes
In order to do this with any degree of accuracy you'd need to have a deep understanding of who is searching for the particular query. Once you've got that demographic (nationality, market segment etc.) nailed, you'd then need to establish the proportion of these users who use Bing vs Google and then multiply the volume to get something close to the Google volume.
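Just to illustrate the arithmetic being described (all of the figures below are invented; they're not real market-share numbers):

```python
# Toy extrapolation: scale a Bing volume by the estimated Google/Bing split
# for your particular audience. Every number here is made up for illustration.
bing_monthly_volume = 1200   # searches/month reported for the keyword on Bing
bing_share = 0.10            # estimated share of your audience searching on Bing
google_share = 0.85          # estimated share searching on Google

estimated_google_volume = bing_monthly_volume * (google_share / bing_share)
print(f"Rough Google estimate: {estimated_google_volume:.0f} searches/month")
```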
But it's not something you really need to do, and I certainly wouldn't suggest trying. What you're looking at when you're comparing keywords is the relative volumes between the different keywords you're researching as you look for that holy grail of high volume / low competition.
To get the Google keyword volumes you're going to have to use the Keyword Planner in AdWords I'm afraid - and even then the keyword volumes are bucketed.
I tend to have already established a list of potential target keywords/topics before I start my research using the keyword difficulty tool to determine whether they are realistic to chase and whether they should be short term or long term goals.
-
RE: Am I being hit by Hummingbird?
Don't think it was Hummingbird.
Have you checked Google Webmaster Tools? Any messages there/manual actions?
I had a quick look at your backlink profile in Ahrefs and noticed that something seems to have happened in December. It looks like you've lost a whole bucketload of links. See this:
https://ahrefs.com/site-explorer/overview/subdomains/www.privateequityfirms.com
I also checked your domain using SEMRush and noted that in December you lost rankings for a number of keywords so I'd check your assumption that it's just your main keyword that's been affected.
What does your google organic traffic look like over the last couple of months?
So, if you've not already done so I'd check your Google Webmaster tools and take a look at what might have been happening with your backlink profile.
-
RE: Keyword Named Domains
Exact match domains can give your site a lift in the search results, but the exact impact depends on a number of factors such as the authority of the competition in the SERPS.
While exact match domains can mean you rank higher, you may find that any advantage you might get is compromised by a reduced click through from the SERPS. Searchers can view them as somewhat spammy and treat exact match domains with suspicion.
The other weakness of course is that exact match domains can only boost the keyword they're a match for, and the effectiveness of exact match domains has diminished and continues to diminish.
Registering exact match domains (or any domain names) and redirecting them to your own site won't affect anything. If they are new domains they won't have any links pointing to them, and as a result will not have any link equity to pass onto your main domain. If they have no content, they won't be indexed by the search engines.
If you purchased an existing, established (and relevant) domain and 301 redirected the domain to your site then you'll pass the equity from the links to that domain onto your main site. Be aware that this boost is unlikely to last long term, as links decay over time. This can happen when a company buys a competitor, for instance.
I don't think registering a new domain and forwarding it onto your site will harm your website. It's a common approach to register a friendly campaign domain and forward it onto a landing page on your main domain - or, if you have a microsite on the campaign URL, to forward this onto your main domain (to a relevant page) once the campaign has run its course.
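If it's useful, here's a rough sketch of what that kind of permanent forward could look like if the campaign domain happened to be served by a small Python/Flask app (the domain and landing page URL are placeholders; in practice most people set this up at the web server or DNS/registrar level instead):

```python
from flask import Flask, redirect

app = Flask(__name__)

# Send every request that hits the campaign domain to the relevant landing
# page on the main site with a permanent (301) redirect.
@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def forward(path):
    return redirect("https://www.example.com/campaign-landing-page", code=301)

if __name__ == "__main__":
    app.run()
```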
So, is it worth building a microsite on an exact match domain? Well, it depends on what your goals are and what resources you have.
Can you build a microsite/landing page that has enough relevance and authority to rank, and what is the cost of doing so? What is the opportunity cost of not spending that same time and effort on your main site (which by its very nature is likely to be more authoritative than the microsite)?
How likely is it that you can build a site around an exact match keyword that will be strong enough to out-rank your established competitors? How many microsites do you see achieving this in the results for keywords in your niche?
What's the best experience for your customers? There's no point ranking for a keyword if ultimately it doesn't attract the right visitors, who ultimately reach your conversion goals.
You'd probably be better off concentrating your efforts on your main site, developing your content there and promoting it to your audience.
Hope this helps.
-
RE: Why is it that certain keywords in my seomoz report card are for the wrong urls
Hi Ewan,
From the On-page help:
"The On-page summary automatically generates reports for any of your campaign keywords that rank in the top 50 of your primary search engine. The URL that it grades is the same URL that appears in the search results.
For instance, if you have 75 keywords ranking in the Top 50, you should have 75 On-Page Reports. This generation happens automatically within 24 hours of when your rankings are updated."
So, looking at the particular example you've mentioned, it means that your homepage is ranking higher than your specific "Chiang mai" url. If you do a search, where are your pages ranking? Where is your homepage and the particular landing page ranking?
Your home page is likely to have significantly higher page authority than one of your sub-pages, but shouldn't have as much relevancy as the specific landing page you've created.
You can manually check the on-page optimisation by using the On-Page Optimisation under the Research Tools menu: (Here: http://pro.seomoz.org/tools/on-page-keyword-optimization/new )
As well as your keyword, you can enter the specific URL that you want to check. You can use this tool to make sure that your specific topic page is optimised for the topic keyword.
You may want to take a look at articles on Keyword Cannibalization such as this one:
http://www.seomoz.org/blog/how-to-solve-keyword-cannibalization
Is your specific page included in the search engine/google's index?
You may find that if it's a newly created page that hasn't been crawled yet, or if the page contains significant amounts of duplicate content, it may not appear in Google's index. You can check this by using site:[URL] in Google.
If your page doesn't appear, you might want to check your site navigation to make sure that your page is crawlable (also think about how any link equity is going to flow to your target page from your homepage/high page authority pages.) You may also want to check your robots.txt and your sitemap if you have them.
You can try using the Fetch as Googlebot option in Google Webmaster tools. Once you've fetched your target page you can submit the page for inclusion in google's index.
Hope this helps.
-
RE: Keyword tracking over time
It depends on a lot of factors, so as Travis and Moosa have said, there really is no fixed rule here.
What resources do you have available (time/money/people) to build any additional content/assets that you need to target new keywords? How easy is it to make changes to the website in order to do the on-page optimisation?
What's the competitive landscape like?
How mature is the site and what's its current performance in search? If the site already has a strong domain authority, maybe you can start to target those juicy head terms that'll bring you lots of traffic? Putting your effort into one or two keywords can make a big difference.
If your site doesn't have the authority you need to compete for these terms, then you'll potentially need to look at a wider range of longer-tail, less competitive keywords. These won't have the same traffic volumes, but at least getting a share of the search traffic will be achievable.
What is the market/niche your site is in? I've seen sites that are so specifically targeted that there really isn't a broad and diverse range of keywords that the site can target. I've also seen businesses struggling to find search traffic that just isn't there - their customers just don't think to look for them on the internet.
In summary, the key really is to prioritise. Think about what's relevant to your business and what is achievable (be honest about this) with the resources you have available. Small incremental growth is better than trying to bite off more than you can chew.
There's no reason not to have short/medium/long term aims for the keywords you want to target.
Also remember it's not just about the volume of traffic, but the relevancy too. A small number of visitors that buy your product are better than thousands that don't.
-
RE: Starting every page title with the keyword
Best practice is a great starting point, but you need to work out what works for your audience, your offerings and your business.
For instance, having a call to action in your title can make a positive difference ("find" is a bit generic, but things like save, download the guide, buy now, etc. can work, if it connects with the searcher's intent.)
Luckily page titles are pretty easy to test - you'll need to keep an eye on your rankings and traffic and measure click-throughs for a suitable period depending on the search volume and taking into account any seasonality etc. As well as the traffic you receive, also look at the conversion rate too - especially important if you're testing for intent.
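As a rough illustration of the kind of comparison you'd end up making (the impression and click figures below are made up; you'd substitute numbers pulled from your own search query reports):

```python
# Compare click-through rate for two title variants over equivalent periods.
# All figures are invented placeholders.
variants = {
    "Original title": {"impressions": 4200, "clicks": 180},
    "Title with call to action": {"impressions": 3900, "clicks": 230},
}

for name, stats in variants.items():
    ctr = stats["clicks"] / stats["impressions"]
    print(f"{name}: CTR = {ctr:.1%}")
```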
You can always try a couple of variations in AdWords to see how they perform, especially for your more important keywords/pages.
The approach you take regarding your titles also depends on the type of page, the nature of the business, your specific business goals, the strength of your brand etc.
Take a good look at the other sites appearing in the SERPS and the titles/descriptions they're using. Put yourself in the place of your audience and try to see what's going to work and what isn't and how you might be able to differentiate your page from the rest.
Also remember that titles have to work in conjunction with the description. While the description isn't used for ranking, it can take some of the load of the title when it comes to supporting click-throughs.
Another point to consider is that Titles aren't just used in search engine results, but also when the page is shared / linked to etc. Depending on your site, you may want to adopt a slightly different strategy for your blog content than you use on a product catalogue for instance.
-
RE: Will pages marked as noindex still be crawled and flagged as duplicate page content in SEOmoz?
What you're seeing is normal. What you want to check is that the number of pages is the number you expect.
From the SEOMoz FAQ on the Crawl Report:
We are still seeing duplicate content on SEOmoz even though we have marked those pages as "noindex, follow." Any ideas why?
SEOMoz is not a search engine index, it uses a crawler. If those pages are not blocked by the robots.txt file, then SEOMoz will crawl them. They ignore the noindex tag because they don't index anything. Search engines will honor the noindex tag and not index a page if you specify with the robots meta tag. However, to remove pages from the crawl, disallow them in the robots.txt or meta robots. (http://www.seomoz.org/help/crawl-diagnostics)
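If you want to double-check which pages are actually blocked from crawling (as opposed to just carrying a noindex tag), Python's standard library can read your robots.txt for you. A quick sketch with placeholder URLs:

```python
from urllib import robotparser

# Check whether a URL is blocked from crawling by robots.txt.
# (A noindex meta tag is a separate, indexing-level signal, which is why
# the SEOmoz crawler still reports pages that carry it.)
rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

url = "https://www.example.com/some-duplicate-page"
print("Crawl allowed:", rp.can_fetch("*", url))
```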
Hope this helps.
-
RE: How do I find the corresponding duplicate content pages from my SEOmoz report?
Hi Barry.
In the Duplicate Page Content report you can click on the number in the "Other URLs" column. This then lists the pages that have been identified as duplicates. If you've got loads, you might want to think about exporting your Crawl Diagnostics as a CSV file.
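If you do go the CSV route, a few lines of Python will pull out just the duplicate-content rows. The file name and column headings below are assumptions - check the headers in your actual export and adjust:

```python
import csv

# Filter an exported Crawl Diagnostics CSV down to duplicate-content rows.
# "crawl_diagnostics_export.csv", "Issue" and "URL" are assumed names -
# swap in whatever your export actually uses.
with open("crawl_diagnostics_export.csv", newline="") as f:
    duplicates = [row for row in csv.DictReader(f)
                  if "duplicate" in row.get("Issue", "").lower()]

for row in duplicates:
    print(row["URL"])
```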
Hope this helps.
-
RE: Why my domain authority doesn't update
In a nutshell, your site isn't reporting any authority because Moz's crawler hasn't discovered any links to your site yet. Moz's index doesn't cover the entire web, so sites that are new and/or don't have many links can find that they're not included in the Mozscape index.
You can find out more about why your site isn't showing any authority here:
http://moz.com/help/pro/open-site-explorer-faq
For info, Majestic site explorer is showing 155 links from 4 external domains. None of these are particularly strong.
Getting (earning) links from more authoritative domains will help Moz discover your site and increase your site's authority in the eyes of Google too.
A couple of quick observations: your site is pretty light on content. I would definitely recommend investing in creating more content around your audience's goals, problems, concerns etc. If you become the definitive/trusted resource you'll earn those links.
Also, looking at the site in Google, there appeared to be a large margin on the right-hand side of the page, pushing the content off the right-hand side of the window. I had to scroll across to the right to see it.