Hello,
Are there any plans to expand Moz Local to Canada?
In the meantime, does anyone have a suggestion for a similar tool for Canadians?
Job Title: SEO Analyst
Company: Vitopian
Website Description
Vitopian provides online services for small and mid-size businesses including search engine optimization and website development.
Favorite Thing about SEO
The idea of changing the overall internet landscape so it actually provides quality content.
My site is: http://goo.gl/ya5jGc

When I perform a Google site search [http://goo.gl/S8hdQD], there are 96 results showing in the search engine results pages; there are 105 results if you select "repeat the search with omitted results included."

Some of our site's pages [http://goo.gl/9guHau, for example] are not included in these results. These missing pages can still be found by performing a modified site search [http://goo.gl/3t2DSJ], which leads me to believe that these pages are, in fact, indexed.

I was under the impression that Google's site search shows all of the pages on a given domain that Google has indexed, but clearly this is not the case. So why aren't all of our indexed pages showing up when I perform a site search? Is there another method or resource I can use to determine which pages are and are not indexed?
Is there much difference in the recovery process for either [Penguin or manual link penalty]?
Theoretically no, practically yes.
A manual penalty will be reviewed by the Google Spam Team. If you are not successful at removing the links, you will need to provide extensive documentation of the steps you took to remove them. When Google manually reviews links, they will not remove the penalty simply because you adjusted anchor text. If a link is spammy, it needs to be removed regardless of its anchor text.
A Penguin penalty can be removed algorithmically. Many SEO companies are simply manipulating the anchor text rather than removing the spammy links, and they are getting away with it to at least some degree...for now. Another tactic is to "drown out" the links penalized by Penguin with other spammy links which do not use anchor text. These solutions are quite bad, as these sites are subject to future penalties as Google improves its algorithms.
You rank #1 in Google.com for "refund fx", which seems to be the focus of your home page.

The population of Australia is around 22 million. In comparison, the world's population is 7 billion.

When you compare google.com.au to google.com, it is a completely different ball game. You can rank #1 in google.com.au but not even make it to the top 200 in google.com.

If you wish to improve your ranking in google.com, you need to sharply increase the quality of your SEO. For example, your pages all show the Australian flag in the upper-left sidebar. That does not suggest a company seeking strong international appeal.
If I want to target a keyword phrase on a particular page, but do not want to change the URL of that page, will that negatively impact my rankings?
A better way to say it is you are missing an opportunity to make a change which can positively impact your rankings.
A page's URL has a very minor direct effect on rankings. There are two larger secondary effects. First, if you offer a clear, relevant URL, your click-through rate may increase. Additionally, when others link to your page by copying and pasting the URL, you will naturally have good anchor text, which is very helpful, especially in our post-Penguin world.

Stefan correctly shared that there is no reason to create a new short URL.
Google ignores the hash tag when indexing URLs. You can offer your home page with various versions of hash tags appended to the end of the URL and Google will not mind a bit. It will not cause any issue for SEO.
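This behavior can be sketched with Python's standard library: stripping the fragment, as Google does when indexing, collapses every hash-tag variant to the same URL. The example.com URLs below are hypothetical, not from the site in question.

```python
from urllib.parse import urldefrag

# Hypothetical URL variants; Google drops the fragment ("#...") before
# indexing, so all of these resolve to the same indexed page.
variants = [
    "http://example.com/guitar",
    "http://example.com/guitar#History",
    "http://example.com/guitar#Construction",
]

# urldefrag() splits a URL into (url-without-fragment, fragment)
indexed = {urldefrag(u).url for u in variants}
print(indexed)  # one URL: {'http://example.com/guitar'}
```

All three variants collapse to a single entry, which is why appending hash tags cannot create duplicate-content problems in the index.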
A few more notes:
If you search Google.com for "Guitar History" you will notice the Wikipedia page is listed first (see attachment). The URL offered by Google is the page URL without any hash tag. Google does offer the ability to "Jump to History," which includes the hash tag link. That is a benefit of using named anchors on a page. Otherwise, Google does not take the hash tag or anything after it into account when indexing pages.
Rand offers a short video on this exact topic: http://www.seomoz.org/blog/whiteboard-friday-using-the-hash
I am not familiar with the exclamation point (bang) being used after the hash tag outside of Twitter. The standard Twitter URLs use it.

Summary: the hash bang is not the reason for your recent drop in rankings.
I am unclear what you mean by "Google still has thousands of the old hashbang (#!) URLs in its index." Can you share an example?
April 24th was when the Penguin update rolled out. An experienced SEO would need to evaluate your site to offer a solid diagnosis but most likely you have been hit by Penguin.
A look at your backlink profile strongly suggests you have been hit by Penguin as well: http://www.opensiteexplorer.org/links.html?no_redirect=1&page=1&site=www.emergencyglassrepair.com%2F
"Windshield Replacement" is used in 8 of your top 10 links. Your link profile is extremely unnatural, which has led to the penalty. This penalty will require a huge amount of time and effort to have removed. Your site shows over 2,200 linking root domains. The cost of fixing this penalty is so high that you may choose to switch to a new domain. In short, you need to contact every site which provided an unnatural link (likely 2k+ domains) and ask them to remove the links. If the site owners actually remove the links, the penalty will be lifted. If many do not, you will have to submit a very detailed record of every link to your site and all the efforts made to remove the links. The IRS requires less documentation than Google does for this issue. Even if you win, they will only "partially" remove the penalty.
Best Wishes.
Crimson offers a great reply and gets a thumbs up from me. I'll just add a bit.
Whether or not you submit a sitemap, Google will visit your site as long as it knows the site exists. If your site offers solid navigation, there is absolutely no need to submit a sitemap. Google will find and crawl all of your pages. If you have coding issues on your site, navigation issues, island pages, etc. then a sitemap is helpful so Google can be aware of these pages it would otherwise not be able to find.
With the above noted, a sitemap is easy to set up and automate. You can pretty much "set it and forget it," so it is still a good practice. Regarding your questions:
1. It's your call. If a page is linked to in your main navigation, such as About or FAQ, then Google should find it 100% of the time. There is no need to include it in your sitemap, but there is no harm either. Either way works.
2. Yes, as per the above: as long as Google can find a page, it will be indexed. You can even have horrible coding and navigation, and Google may still locate your pages if you have earned external links to them from credible sources.
3. Last I checked, a sitemap can hold 50k URLs. If your site has more than 50k URLs, you can break the sitemap up into smaller files. The advice Crimson shared is correct.
In summary, if you implement all best practices in your site design and do not have any island pages then a sitemap is not needed but it is a nice backup.
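To illustrate point 3 above: the sitemap protocol caps each file at 50,000 URLs, so an automated generator simply chunks its URL list into multiple files. Here is a minimal sketch using only Python's standard library; the example.com URLs are hypothetical.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50000  # per-file limit from the sitemap protocol


def build_sitemaps(urls):
    """Split a URL list into sitemap XML documents of at most MAX_URLS each."""
    sitemaps = []
    for start in range(0, len(urls), MAX_URLS):
        urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
        for url in urls[start:start + MAX_URLS]:
            # Each URL gets a <url><loc>...</loc></url> entry
            loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
            loc.text = url
        sitemaps.append(ET.tostring(urlset, encoding="unicode"))
    return sitemaps


# Hypothetical pages; a real site would enumerate its own URLs
pages = [f"http://example.com/page-{i}" for i in range(3)]
print(build_sitemaps(pages)[0])
```

Sites over the limit would get a list of several files back, each of which can then be referenced from a sitemap index file.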
**How do I tell Roger not to crawl these blank pages?**

An easy solution is to block Roger in robots.txt:
User-agent: rogerbot
Disallow: [enter pages you do not wish to be crawled]
But a better solution would be to fix the root problem. If your only goal is to provide clean reporting to your client the above will work. If your goal is to ensure your site is crawled correctly by Google/Bing, then Jake's suggestion will work. You can help Google and Bing understand your site by telling them how to handle parameters.
I would prefer to fix the root issue though. Do the pages which are being reported as duplicate content have the "noindex" tag on them? If so, you can report the issue to the moz help desk (help@seomoz.org) so they can investigate the problem.
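If you want to confirm a robots.txt rule like the one above behaves as expected before deploying it, Python's standard-library parser can simulate Rogerbot's view of the site. The paths and domain here are hypothetical placeholders.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt blocking rogerbot from one directory
rules = """\
User-agent: rogerbot
Disallow: /blank-pages/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# rogerbot is blocked from the disallowed path but not the rest of the site;
# other crawlers are unaffected because no rule names them
print(parser.can_fetch("rogerbot", "http://example.com/blank-pages/x"))   # False
print(parser.can_fetch("rogerbot", "http://example.com/about"))           # True
print(parser.can_fetch("Googlebot", "http://example.com/blank-pages/x"))  # True
```

This also demonstrates why the robots.txt approach only cleans up Moz reporting: Googlebot and Bingbot still crawl the pages unless you add rules for them or fix the root problem.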
Hello Joe.
I have experienced this issue many times. I believe it is a bug in the mozbar. I have often found the bug resolves itself after a period of time. Sometimes if you reload the page it works correctly, other times not.
The most successful way to resolve it for me is switching from Chrome to Firefox browser. I like the SEOmoz toolbar in FF better anyway.
You are likely seeking SH404SEF: http://anything-digital.com/sh404sef/seo-analytics-and-security-for-joomla.html
We use Yoast's SEO plugin on all our WP sites, and the SH404SEF plugin on all our Joomla sites. Like Yoast, the SH404SEF extension uses SEO best practices, and the author reads SEOmoz. If I had to compare the two extensions, Yoast's would be considered better if for no other reason than that it is more user friendly. Nevertheless, the SH404SEF extension is awesome.
What I mean is, let's assume you were to get a link from the BBC or CNN; how long does it take for that link to have a positive impact on your rankings?
The question is a bit vague so I need to make some assumptions. Feel free to correct me if I make any errors.
First, I am presuming you are referring to Google. Each search engine handles links differently. Next, I presume by "take effect" you mean the link will offer a positive benefit to the site's rankings. You could be referring to PR or even SEOmoz PA.
When Google discovers a link the target page will immediately benefit from the link. If you receive a link from the home page of CNN you will likely notice the benefits within minutes of the link being published. If your link is deeper in the site it may take longer to be discovered but most likely within a couple hours. If you receive a link from other sites the link may never be discovered or it may be deemed spammy and offered no value.
**If you do get a link, is it normal that Google would crawl your site, and if so, do you see this reflected in Google's cache immediately?**
Once again, there are numerous factors involved. If you receive a link from a web page with 1000 other links, then Google may not follow the link at all. If you receive a link from the home page of CNN it is highly likely Google will follow it, but they may not do so on the initial visit. Whenever the link is followed, you can expect for Google to update their data.
Your site is very poorly optimized. You would benefit a lot by implementing basic SEO best practices. A good place to start is here: http://www.seomoz.org/beginners-guide-to-seo
Regarding Bing/Yahoo: Bing has a 10-year agreement to control Yahoo's search results, so they are basically the same company from a search results perspective. They have different ownership than Google and operate differently. If you have an outstanding site, then your results may be aligned, but otherwise there are often differences.
What keyword are you trying to rank for? A best practice is to focus on a single keyword per page. Your home page title is: "Roof Installation, Roofing Contractor, Siding Contractor Chicago, IL." Apparently you are trying to rank for multiple terms, which will not yield the best results.
Another sign of your lack of keyword focus is your H1 tag, which is missing on your home page.
When I search Google for "All American Exterior Solutions" you rank #1. If you want to rank for "roof installation" there is a lot of SEO work to perform. Your home page only uses the term a single time in content and it is not in the first sentence so it does not appear to be the focus of the page. You also have another page of your site which is better optimized for the term: http://www.aaexs.com/residential/roofing
Also, your footer is incredibly spammy.
Google desires to return relevant, quality pages in their search results. You may be a great company but your web pages need to improve in order to rank in Google for the terms related to your business.
Would I be ruining my SEO work if I begin to publish blog posts for the same keywords that my content pages target? Am I basically forced to find alternative keywords and only target one page per keyword?
In short, yes.
When Google provides search results, it needs to search trillions of pages to determine which result is most likely to satisfy a user's query. One of the key components of its algorithm is relevancy. If you have a page titled "chocolate ice cream" and then a blog article with the same title, which result should be returned to a user who searches Google for "chocolate ice cream"?
If you offer multiple pages with the same keyword focus, you run into an issue called keyword cannibalization. You can solve that issue by narrowing the focus of one of the pages. For example, the main page on your site is what I would refer to as "evergreen" content: 10 years from now someone can read that page and the information is likely still valid. Your blog offers fresh content which is more time sensitive. Some possible topics for an article:
Top 10 Chocolate Ice Creams in the world
Lowest Calorie Chocolate Ice Cream
Chocolate Ice Cream Recipes
I would also recommend being very careful when providing content on two similar keywords. It takes a level of expertise to do it in such a way that it adds value to your site. One helpful step is to use anchor text. If you write an article on "Chocolate Ice Cream Recipes," then once in the article, when you refer to "Chocolate Ice Cream," present it as an anchor link to your main page.
Roughly once per month. There are several factors which affect the schedule so it seems to vary. Here is a link to the schedule: https://seomoz.zendesk.com/entries/345964-linkscape-update-schedule
It just updated on May 31st, and the next update is scheduled for June 27th.
Jarin,
I cannot specifically answer why your site's PR is not available. There are numerous possibilities including a glitch in the latest data feed which offers PR. I was able to verify using a couple tools the PR for your website is not available.
I understand your concern. Penalized websites will show the same results you are seeing, a PR of n/a or a grey bar. Since you are ranking #1 for numerous keywords it seems clear you are not penalized.
I do not look at the PR for any of my sites. It has no tangible value. You can choose to agree or disagree, but I have worked with many sites and helped them earn top rankings without looking at PR at all. If you don't wish to take my advice, perhaps you will accept the same advice directly from Google.
In 2009, Google discontinued the practice of regularly updating PR, and now they only offer the information about 3-4 times per year. Susan Moskwa of Google wrote:
"the PR you see publicly is different from the number our algorithm actually uses for ranking. Why bother with a number that’s at best three steps removed from your actual goal, when you could instead directly measure what you want to achieve? "
For further details read the full article: http://googlewebmastercentral.blogspot.com/2011/06/beyond-pagerank-graduating-to.html
A great tool I use is SEMrush: http://www.semrush.com
You enter a website and it provides a thorough list of keywords the site ranks for, along with their positions in the SERPs and other details.
Here is a screenshot of the SEMrush results for SEOmoz. I was surprised to see SEOmoz rank for "slickdeals" but it seems in 2005 Rand wrote a few words on that topic so it is a legitimate result.
http://www.vitopian.com/shared/semrush-competitive-research.png
Can you share what changes have been made to the site? A few ways this can happen are:
a change to the robots.txt file
a change to your site's template either removing a canonical tag, a noindex tag, or altering your pagination in any way such as modifying paginated titles
resolving an onsite issue which prevented crawling of these pages
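To audit the second possibility, you can scan a page's head for a robots meta directive and a canonical tag and compare it against an older version of the template. Here is a minimal sketch with Python's standard library; the HTML snippet is a hypothetical example, not your site's markup.

```python
from html.parser import HTMLParser


class IndexabilityChecker(HTMLParser):
    """Collect the robots meta directive and canonical URL from a page."""

    def __init__(self):
        super().__init__()
        self.robots = None
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots = attrs.get("content", "")
        elif tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical = attrs.get("href")


# Hypothetical page head; on a real site you would fetch each template's HTML
html = (
    '<head><meta name="robots" content="noindex,follow">'
    '<link rel="canonical" href="http://example.com/page"></head>'
)

checker = IndexabilityChecker()
checker.feed(html)
print(checker.robots)     # noindex,follow
print(checker.canonical)  # http://example.com/page
```

If a template change dropped the canonical tag or introduced a stray noindex, a scan like this across your page types will surface it quickly.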
SEO Analyst working for a quality company in Rancho Cordova, CA. Focused on implementing SEO best practices.