I just asked the Moz team this earlier this week.
They are working on functionality similar to this where you can add multiple users to one account; however, they have no ETA on when this will be available.
Mike
It looks like it is pulling some localized results for that. I am in Minneapolis, MN and do not see rockymountainfurniture.com in the first 50 results; however, when I change my location to Atlanta, GA (I looked at the location in your profile), I do see them on page one.
Mike
Did you submit a new sitemap in your Bing Webmaster Tools?
Mike
Do a scan with Screaming Frog. It can point out the exact pages where you have broken links.
They have a free trial and a paid version.
This program is great for diagnosing many different website related issues.
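If you'd rather script a quick check yourself before reaching for a crawler, here is a minimal sketch using only Python's standard library. It just pulls every href out of a page's HTML; actually requesting each URL and checking for 404s would be the follow-up step. The sample page markup is made up for illustration.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href value of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html_text):
    parser = LinkExtractor()
    parser.feed(html_text)
    return parser.links

# Two links; you would then request each one to confirm it doesn't 404.
page = '<p><a href="/about">About</a> <a href="http://example.com/old">Old</a></p>'
print(extract_links(page))  # ['/about', 'http://example.com/old']
```

A tool like Screaming Frog does the same thing at scale, plus the status-code checking, so this is only worth it for a handful of pages.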
Hope this helps.
Mike
Create landing pages for both on your website. So domain.com/location1 and domain.com/location2.
Then use GetListed.org and get both locations listed in the list of business directories.
Hope this helps.
Mike
A sitemap is like a map with driving directions for Google. Sure they can probably find their way through your site, but with a map they can get through more efficiently and make sure to look at all of your pages.
It is not required for a site to get indexed.
If your entire site is indexed, you don't "need" a sitemap; however, it is good practice to have one.
Say you add a new page and want Google to index it quickly: you would just update your sitemap and resubmit it to Google. That alerts them that you have made changes and prompts them to recrawl your site.
If you Google site:logobids.com you can see which of your pages Google has indexed.
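For what it's worth, a sitemap is simple enough to generate yourself. Here is a rough sketch that builds a minimal XML sitemap with Python's standard library; the URLs are placeholders, not pages from logobids.com.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a bare-bones sitemap: a <urlset> with one <url>/<loc> per page."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        loc = ET.SubElement(entry, "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(["http://example.com/", "http://example.com/new-page"])
print(sitemap)
```

You would save the output as sitemap.xml at your site root and submit it in Webmaster Tools. Real sitemaps can also carry optional tags like lastmod, but loc alone is valid.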
Hope this helps.
Mike
Crawl Notices are merely a "heads-up" notice. Having a 301 redirect from http://mysite.com to http://www.mysite.com is perfectly normal and correct if that is the version you want displayed to users.
One thing to note, you may be getting this message if:
Hope this helps.
Mike
As long as you are linking out to relevant content, it is perfectly fine to be cross-linking between sites (as long as you are not doing something spammy like a site-wide footer link).
Search engines do not find it odd when two sites that are relevant to each other link to each other, because that appears natural.
Hope this helps.
Mike
Keyword density can help Google identify what your page is about.
There is not a certain % of keyword density that will give you good or bad results - the thing to keep in mind while you are writing is your visitors. You want to mention your keyword where applicable in your content; however, you do not want it to seem forced.
Best way to judge how well content is optimized? I like using the SEOmoz On-Page Optimization tool.
David Naylor has a pretty cool keyword density tool which returns some good info.
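If you just want a quick number without a third-party tool, a density calculation is trivial to script. This sketch counts occurrences of a keyword against total word count; the sample text and the interpretation of "density" are my own illustration, not any tool's formula.

```python
import re

def keyword_density(text, keyword):
    """Occurrences of the keyword divided by total word count."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = len(re.findall(re.escape(keyword.lower()), " ".join(words)))
    return hits / len(words)

text = "Fresh roasted coffee beans. Our coffee is roasted daily."
print(round(keyword_density(text, "coffee"), 2))  # 0.22 (2 of 9 words)
```

Again, there is no magic percentage to aim for; a number like this is only useful for spotting obvious stuffing.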
Hope this helps.
Mike
Hi Jack,
Server errors, redirect loops, disallowed pages in robots.txt, etc.
Here are some suggestions from www.seomoz.org/help/crawl-diagnostics:
"Why didn't you crawl all my pages? I only got a one page crawl. Looks like you missed a bunch!
If you suspect you didn't get a full crawl, or Rogerbot missed some of your pages, there could be several reasons why this happens."
If you are certain you don't have any of the above problems, then I would suggest contacting help@seomoz.org
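One of the causes above, pages disallowed in robots.txt, is easy to rule out yourself with Python's standard library. This sketch parses a made-up robots.txt and asks whether a crawler may fetch a given URL; swap in your own rules and paths.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules -- replace with your site's actual file.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]
rp = RobotFileParser()
rp.parse(rules)

# Rogerbot falls under the wildcard agent here.
print(rp.can_fetch("rogerbot", "http://example.com/private/page"))  # False
print(rp.can_fetch("rogerbot", "http://example.com/public/page"))   # True
```

If a page you expected in the crawl comes back False here, that is your answer before you ever email support.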
Good luck.
Mike
Rand talks about co-occurrence/co-citation in this WBF video.
He gives some great examples of how this can potentially increase rankings. It is a very interesting video.
Hope this helps.
Mike
I believe it really comes down to personal preference.
Generally, when you see subdomains in the wild, you see the non-www version; however, it really does come down to personal preference.
You are not going to be penalized or anything for using the www version.
It is slightly less work for the user to type sub.domain.com instead of www.sub.domain.com - but I don't think four extra keystrokes are going to be that big of a deal.
Mike
Hi Ed,
First - cool websites! Visually they look really nice.
According to Matt Cutts, Google does not consider bounce rate as a ranking factor. Think of it this way: if you Google weather in Minneapolis, click on the weather.com page, see the high today is 36 degrees (*sigh*), and then leave the page, you are bouncing. BUT does that mean weather.com didn't answer your question and should rank lower because of a higher bounce rate? That is why Google claims they do not consider bounce rate a ranking factor: users could easily find the answer to their question and leave a site within seconds.
That said, if you look at OpenSiteExplorer.org you will see that your SparrowMakeup.com.au site has very high authority, while your Make-up-Artists.com.au does not.
If I were you, I would incorporate the cool functionality of your portfolio into your main SparrowMakeup.com.au site. I understand your reasoning for having two separate sites, because of the different look/theme; however, I think when people are looking at a portfolio, they expect to see something more artsy and different looking. I think you could definitely get away with incorporating the same functionality from your portfolio site onto your main site.
Since it appears that you don't have any text on the portfolio page, it isn't super likely that you'd rank for that page; but directing someone to an "external" website might be confusing, since the portfolio is essentially an extension of the main site. If you want to continue to have it be a separate site, that is fine; however, I would recommend that you open your portfolio page in a new window... that way, once/if they close your portfolio window, they are brought back to your main website.
Does that all make sense?
Hope this helps.
Mike
It can take GWT weeks or even months to remove these warnings from its reports.
As long as you have personally verified that they are fixed on your live site, you do not need to worry.
I just verified that your /estimate-request.html is using the description you stated above; however, Google is still using the meta description you had in place on Jan 30, 2013.
Once Google re-indexes your page, it will appear correctly in the SERPs, but like I said, it may take months for this fix to be reflected in Google Webmaster Tools.
Does that help?
Mike
I would suggest removing them.
You are spot on with the fact that competitors can use them. Also, they add unnecessary HTML to your page.
In addition, most pages only target one, maybe two, keywords, so having a long list of keywords really would not be beneficial.
Mike
You sir are a gentleman and a scholar.
Thanks for your help Matt.
According to Google -
"How often does Google crawl the web?
Google's spiders regularly crawl the web to rebuild our index. Crawls are based on many factors such as PageRank, links to a page, and crawling constraints such as the number of parameters in a URL. Any number of factors can affect the crawl frequency of individual sites.
Our crawl process is algorithmic; computer programs determine which sites to crawl, how often, and how many pages to fetch from each site. We don't accept payment to crawl a site more frequently. For tips on maintaining a crawler-friendly website, please visit our Webmaster Guidelines."
Hope this helps.
Mike
That's a bummer Jesse.
I would not take any action until you read through the following articles below.
First READ THIS from Google.
From what I have read, you should make sure you are documenting your attempts to get the links removed. Matt Cutts states that you should try to get links removed manually before turning to the disavow tool.
Duke Tanson wrote a great article on how he used the disavow tool to remove an unnatural link warning.
That should be all of the information you need.
Good luck.
Mike
Hi TJ,
I have actually made the switch to friendly URLs in DNN, so I know your pain.
Best practice is to make the 301 redirect in one jump. So instead of going from oldsite > dnn > dnn-friendly, you should set up your redirects to perform oldsite > dnn-friendly and dnn > dnn-friendly.
Redirecting with a chain A > B > C can slow your site down for users, as well as slow down the crawlers, which is bad.
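The "one jump" rule above is easy to express in code. Here is a small sketch that takes a redirect map with chains in it (like oldsite > dnn > dnn-friendly) and rewrites every source so it points straight at the final destination. The URLs are placeholders, not your actual DNN paths.

```python
def flatten_redirects(redirects):
    """Rewrite a {source: destination} map so no destination is itself redirected."""
    flat = {}
    for src in redirects:
        dest = redirects[src]
        seen = {src}
        # Follow the chain until we reach a URL that is not itself a source,
        # guarding against redirect loops with the `seen` set.
        while dest in redirects and dest not in seen:
            seen.add(dest)
            dest = redirects[dest]
        flat[src] = dest
    return flat

chain = {
    "http://oldsite.com/page": "http://site.com/dnn/tabid/57/page.aspx",
    "http://site.com/dnn/tabid/57/page.aspx": "http://site.com/page",
}
print(flatten_redirects(chain))
# Both sources now 301 directly to http://site.com/page
```

Run something like this over your redirect list once, and every old URL 301s in a single hop.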
Does that answer your question?
Mike
TextMarketing is spot on!
Either re-writing from memory or having someone else write the content based on a generic layout are two ways around having duplicate content.
And just for some additional info, this is what Google considers duplicate content: "Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar." and "...content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic. Deceptive practices like this can result in a poor user experience, when a visitor sees substantially the same content repeated within a set of search results."
Duplicate Content = BAD SEO and BAD User Experience.
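If you want a quick way to spot "appreciably similar" content between two pages, here is a rough sketch: compare the overlap of 3-word shingles between the two texts (Jaccard similarity). The sample sentences and any threshold you'd pick are my own illustration, not anything Google publishes.

```python
def shingles(text, n=3):
    """All runs of n consecutive words in the text, lowercased."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b):
    """Jaccard overlap of the two texts' 3-word shingles (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "we sell hand made oak furniture built to order in our workshop"
page_b = "we sell hand made oak furniture built to order with free delivery"
print(round(similarity(page_a, page_b), 2))  # 0.54
```

Two pages scoring that high on real copy would be exactly the kind of near-duplicate worth rewriting from scratch.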
Mike