I haven't tested or tried it yet, but here's your answer.
http://seogadget.com/links-data-excel-seomoz-api/
You'll need Excel plus SEO Tools for Excel.
I'll give it a bash a bit later and see if it works.
Greg
Hi,
If your title tag is too long, or if it contains too many similar words (or repeated words/phrases), Google will use what it thinks is better.
Keep your title tags short and descriptive, and don't target too many different keywords: two at most, in my opinion.
Greg
Mozbot usually crawls once every 7 days and takes a day or two to complete the crawl.
Give it at least 7 days.
On the bottom right, under "Crawl Diagnostics" for your campaign, it should say when the next crawl is scheduled.
Greg
I like the 3 organic traffic metrics in SEOmoz PRO:
1) Organic Search Visits
2) URLs Receiving Entrances Via Search
3) Non-Paid Keywords Sending Search Visits
They rock - especially for a high-level view of SEO / Organic performance.
Is it possible to setup these metrics as custom reports directly in Analytics?
Does anyone have these reports setup?
Or maybe there are similar custom reports that give a similar high-level view?
I would leverage the domain name and create a subdomain. It's still the same company, just with a different service, so a subdomain would work best in my opinion.
I would milk the sister domains, but don't wait for a ban. Work out a way to avoid the duplicate-content flag, either by linking to the original content or by using unique content.
It will take much more time and effort, but it's worth it if you want to maintain the traffic coming from those sites.
It might not work so well in the long run.
Your .htaccess file will get very big, slowing down the redirect process. I can't say by how much, but the bigger your .htaccess file, the slower the redirects.
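As a rough illustration of why the file balloons (the URLs below are made-up placeholders), each retired page typically gets its own one-to-one rule, and Apache re-reads and evaluates .htaccess on every request:

```apache
# One mod_alias rule per retired URL; with thousands of pages,
# Apache scans all of these lines on every single request.
Redirect 301 /old-page-1 http://www.example.com/new-page-1
Redirect 301 /old-page-2 http://www.example.com/new-page-2
Redirect 301 /old-page-3 http://www.example.com/new-page-3
```

Where the old and new URLs follow a pattern, a single RewriteRule can usually replace many of these lines.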
Try to get your targeted keyword at the front of the title tag if possible. Your example above doesn't dilute the page, as it doesn't compete with any of your other pages (blue paint, green paint, etc.); it just adds some enticing copy to encourage clicks.
When you are faced with the choice between a highly searched exact-match term like "Red Paint" and wording that reads better in the SERPs, go with what users would prefer.
Example:
Best for SEO :
Red Paint at Great Prices - CompanyName
Best for SEO + better click-throughs:
Quality Red Paint at Great Prices
Use the first example (the exact-match keyword) as much as you can, and use the second example when the exact-match keyword doesn't make sense or doesn't sound enticing.
You want your title tags to be as specific to a page as possible. Only include the most relevant keywords in your title tag. Synonyms are welcome, but not keywords unrelated to the page.
In the case mentioned above, a page about "Red Paint" should have a title tag with ONLY
"red paint | reddish paint | blood red paint" etc.
Greg
Perhaps advertise the job opportunity on the "other" sites with an "enquire" or "find out more" button that links to the original job ad on your website.
The "view or apply to job" button links back to the original advert.
As long as you are linking to the original source, you are safe.
Greg
Relevance is important, but so are domain authority and trust.
Take a link from a news website, for example: it's unrelated, but it can provide a powerful boost to your website's "trust".
Your main objective, in my opinion, should be to get links on relevant websites; however, you can mix the two by getting links from high-authority unrelated websites as well. There are a few guest bloggers on SEOmoz who link to their companies' websites, which are totally unrelated to SEO.
Let's say you already had a mix of relevant and unrelated high-authority links, and you had to choose between DA 80 (unrelated) and DA 40 (related):
I would go with the most powerful site, the DA 80.
Greg
Bumped!!
We uploaded over 150 new product pages, and after two weeks Google still had not indexed them. After a few hours of trying to figure it out, we found that all the new pages had a canonical link pointing to the parent page (they used the parent page's template).
We removed the canonical tag, but we are still waiting!
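For anyone unfamiliar with the tag, the culprit was a line along these lines in each product page's head (the URL here is a placeholder, not our actual site):

```html
<!-- Inherited from the parent-page template: this tells Google to
     treat the parent category page as the canonical version and
     skip indexing the product page itself. -->
<link rel="canonical" href="http://www.example.com/parent-category/" />
```

Each page's canonical should point to its own URL, not the template's.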
Reminder to self: check all the technical stuff when publishing new pages!
Hi All,
I am looking for websites with keywords in the domain and I am using:
inurl:keyword/s
The results that come back include sub-pages, not only domains with the keywords in the root domain.
Example of what I mean, with only what I want displayed:
www.keyword/s.com
Does anyone know of a search command I can use to display URLs with keywords in the root domain only?
Thanks in Advance
Greg
It really depends on what your intentions are.
If you are looking at it technically, I believe 20 links from 20 different domains will be more beneficial in terms of improving your link profile and rankings.
On the other hand, if the audience on that website is closely related and you think you'll get a lot of referral click-through traffic from the 20 different articles, then that could be an option.
Adding up the pros and cons, mixed with the intent of your promotion, will give you your answer.
If you're looking at improving your link profile (rankings), then I would go 1 on 20 rather than 20 on 1.
Greg
You can't have both versions indexed in Google. The www and non-www versions are two separate pages and are seen as duplicate content if you use both.
As Google has already indexed the non-www version, I would redirect all your www pages to their non-www equivalents to avoid duplicate content.
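A common way to do this on Apache (a sketch, assuming an .htaccess with mod_rewrite enabled; adapt the pattern to your own domain) is a site-wide 301:

```apache
RewriteEngine On
# Match any host beginning with "www." and capture the bare domain,
# then 301-redirect to the non-www version, preserving the path.
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^(.*)$ http://%1/$1 [R=301,L]
```

The 301 also passes the old pages' link equity to the non-www versions.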
My understanding is this:
If it's a duplicate, as in a copy-and-paste article, then Google will eventually de-index the duplicate and keep the original article.
In your client's case, though, they are providing the source links, so Google doesn't label it as duplicate content but sees it as syndicated content.
Look at news sources, for example: the same article syndicated on multiple sites is indexed and stays indexed. (This is the case for your client's site.)
What I would tell your client is that fresh, unique content on their site is key for SEO. Syndicating articles doesn't provide any benefit in terms of unique and fresh content, so the operation is pointless unless it's purely for user experience.
Give them an example: say it's the same as giving away articles to other websites and then reusing them on their own site as "second-hand" articles. Just because it's WordPress doesn't mean it's any different from any other website out there.
Good luck!
Greg
Hi All,
To my knowledge, hCards for rich snippets are generally used by bloggers etc.
I would like to use the business logo rather than a photo of myself, as well as the business twitter, G+ local profiles and Bio for the business.
The Author page will be the "about us" page with info about the company.
Is this OK? Or does it have to be personal?
You should only no-follow your tags and archives, not your categories...
In the plugin settings, under Permalinks, there is an option:
"Strip the category base (usually /category/) from the category URL." This will just stop the duplicate pages from appearing.
Blocking the categories must have caused the drop.
Greg
Well, the duplicate content is causing issues on its own... Google does not like duplicate pages at all...
If you select which are your primary pages and tell Google to ignore the rest, it can only help your rankings.
With the Yoast SEO plugin, all you need to do is set tags to no-follow and no-index, and also strip the category base from the URL (it redirects automatically as well).
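For what it's worth, the end result of the no-index setting is a robots meta tag in the head of each tag archive (a sketch; the exact markup depends on your Yoast version):

```html
<!-- Emitted on noindexed tag archives: keeps the page out of
     Google's index while still letting its links be followed. -->
<meta name="robots" content="noindex,follow" />
```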
Greg