Here are a couple I use:
- GsiteCrawler http://gsitecrawler.com/
- ScreamingFrog http://www.screamingfrog.co.uk/seo-spider/
Hope it helps
Hi Sara,
The most important thing you need to add to the GA code is:
_setDomainName
referencing the root domain. For example:
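A minimal sketch of the classic ga.js snippet; UA-XXXXX-1 and example.com are placeholders for your own property ID and root domain:

```
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-1']);
  // Reference the root domain so tracking works across all subdomains
  _gaq.push(['_setDomainName', 'example.com']);
  _gaq.push(['_trackPageview']);

  (function() {
    var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
  })();
</script>
```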
Hope it helps,
Pbhatt
Think about the crawl budget allocation that Google gives your website.
By blocking search pages in your robots.txt, you're saving a lot of budget for crawling and recrawling your important pages. On the other hand, there are instances where search pages produce indexable URLs with thin content, which could make your site a Panda candidate.
So no, there's no downside here: you did a good job.
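As a quick sketch, assuming your internal search results live under a /search/ path (adjust the pattern to however your site actually builds its search URLs):

```
User-agent: *
Disallow: /search/
```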
The position provided by SEOmoz comes from a query in an unpersonalized search using a specific TLD (.com/.ca/.uk).
What Google provides in the SEO reports in GA is an average across one or more queries, and it's based on real users, who may have personalized results turned on or off (and this can affect your rankings).
Hence the difference.
Hope this helps
Hmmm, pretty interesting.
Can you go over to http://www.seomoz.org/google-algorithm-change and, at the same time, open your Google Analytics and match the day your traffic dropped against the dates of these algo changes to see if anything lines up?
Awaiting your reply
Well, the simple answer for you is that Google allocates a crawl budget based on multiple factors.
With your current settings, the crawlers are wandering off after these search pages that add no value to the web, and you're losing a lot of your budget on them. I would definitely direct the crawlers to crawl your content instead and update it whenever you add or change a page.
"Actually, if you access with a mobile device userAgent to any desktop url you are redirected to the home of the mobile web. This is the only redirect implemented about mobile and desktop versions."
Any URL from the desktop site will be redirected to the homepage of the mobile site? If yes, then you need to fix that.
Best practice is domain.com/sample.html to m.domain.com/sample.html (see the sketch below). If the mobile page doesn't exist yet, leave it alone and don't add any redirect.
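A rough .htaccess sketch of that one-to-one mapping, assuming Apache with mod_rewrite; the user-agent pattern here is heavily simplified (a production list of mobile user agents would be much longer):

```
RewriteEngine On
# Send mobile visitors to the same path on the m. subdomain
RewriteCond %{HTTP_USER_AGENT} "android|iphone|ipod|blackberry|mobile" [NC]
RewriteRule ^(.*)$ http://m.domain.com/$1 [R=302,L]
```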
A) Don't provide any "special" treatment to Google's mobile bot; let it crawl your desktop and your mobile version.
B) The better solution is to create a mobile sitemap (a sketch follows this list) and submit it in GWT as m.domain.com/sitemap.xml http://support.google.com/webmasters/bin/answer.py?hl=en&answer=34648&topic=8493&ctx=topic
C) Because it resides on a subdomain, you will need to do additional work.
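A minimal mobile sitemap sketch; the <mobile:mobile/> annotation is what marks the URLs as mobile, and m.domain.com/sample.html is just a placeholder:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
  <url>
    <loc>http://m.domain.com/sample.html</loc>
    <mobile:mobile/>
  </url>
</urlset>
```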
Hi Mariano
A) As you mentioned, Google supports up to 50,000 URLs in a sitemap. This limit applies only to the number of URLs referenced in <loc> tags; URLs referenced as hreflang alternates don't count toward the 50k (see the sketch below).
B) Yes, implement separate sitemaps, as they are different domains.
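A sketch of the hreflang annotations in a sitemap, assuming example.com and example.de as placeholder domains; only the <loc> URL counts against the 50,000 limit, not the xhtml:link alternates:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/page.html</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/page.html"/>
    <xhtml:link rel="alternate" hreflang="de" href="http://www.example.de/seite.html"/>
  </url>
</urlset>
```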
Please note that the optimal outcome is to appear in the map listings AND in the organic listings; not every search for a lawyer in Sacramento will trigger the local map.
To increase your rankings in Google+ Local, you need more citations (build them here, or sign up for the Local Citation Finder by Darren) and a higher review count.
If they have links pointing to them, I would definitely add them back to your server and then add an .htaccess rule to 301 redirect them to the main site.
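A quick .htaccess sketch, assuming Apache; the file names are placeholders for whatever old files have links pointing at them:

```
# Permanently redirect the old linked files to the main site
Redirect 301 /old-file.html http://www.example.com/
Redirect 301 /old-brochure.pdf http://www.example.com/
```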
No one mentioned OpenCart yet? Wow.
You are experiencing two problems, and there is a common thread between them: you're deleting/not re-implementing the "old" redirects.
As you know, once you redirect page A, you have to keep that redirect for as long as there is a website on that domain. There is no threshold where, say, a year after implementing a 301 redirect, it is no longer needed.
There is a good video on this from Matt Cutts that I posted in the references.
You can conclude that, "in a way", a 301 redirect should never be deleted, especially "when Google starts receiving mixed signals about this domain".
So the solution to your problem is to re-implement those old redirects you had before.
Hope this helps
I would definitely use schema.org.
Google and the other search engines are trying to move away from RDFa and harmonize everything under schema.org.
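A small microdata sketch using schema.org's Product type; the names and values are placeholders:

```
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">19.99</span>
    <meta itemprop="priceCurrency" content="USD"/>
  </div>
</div>
```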
It does not hurt the site directly. These headers can help you optimize your website's speed: setting the "Expires" header to a future time lets the browser cache resources, which makes your site faster to load and navigate. So it depends on your niche and competitors; if all variables are equal and the only difference between the two sites is speed, then you'd better optimize your site's speed.
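For example, on Apache you could set far-future Expires headers with mod_expires (a sketch; tune the lifetimes to how often your assets actually change):

```
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```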
A single redirect means:
oldpage.html ---> newpage.html
A multiple "chain" redirect means:
oldpage1.html ---> oldpage2.html ---> oldpage3.html ---> newpage.html
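To avoid chains, point every old URL straight at the final destination instead of at the next hop. A sketch in .htaccess, with placeholder file names:

```
# Each old URL jumps directly to the final page, with no intermediate hops
Redirect 301 /oldpage1.html http://www.example.com/newpage.html
Redirect 301 /oldpage2.html http://www.example.com/newpage.html
Redirect 301 /oldpage3.html http://www.example.com/newpage.html
```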
"In your past experience, have you seen that practice causing performance issues?"
A single redirect? No (its impact is negligible).
"Also, what would you think about that answer I found in one of the Q&As: '(4) Be careful with a massive number of 301s. I would not 301 100s of pages at once. There's some evidence Google may view this as aggressive PR sculpting and devalue those 301s. In that case, I'd 301 selectively (based on page authority and back-links) and 404 the rest.' What do you think about that?"
It has never happened to me; we have plenty of experience renovating sites and implementing 301s (from small sites up to 10,000-page sites) with no problems.
Hello Nancy,
Hmmm, no.
Let me explain why.
True, your inner pages have no inbound links, but remember that you have links pointing to your homepage.
The authority your homepage has is passed down to your inner pages, either through navigational links or contextual ones.
Currently, each page on your site has an actual page authority inherited from your homepage, so implementing a good page-to-page 301 will at least partially preserve this authority and, of course, the rankings each page currently holds.
My past experience is that rel canonical transfers faster than a 301.
Put both sites online, and on site A add a rel canonical from each page to its counterpart on site B; this will make the transition faster (see the sketch below).
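On each page of site A, the tag would look something like this (a cross-domain rel canonical; the URLs are placeholders):

```
<!-- In the <head> of http://site-a.com/page.html -->
<link rel="canonical" href="http://site-b.com/page.html"/>
```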
Hi Anthony,
Are we talking about 2 different domain names?
Is the content the same from site A to site B?
Usually, when crawlers come to your website and notice the 301s, they will keep visiting your site and swap the URLs.
If you are moving to a different domain, you might see some drop in rankings due to the new domain's low authority; it takes time for the authority to move from one domain to the other.
The higher the authority of site A, the faster the transition will be, without any interruption in rankings.
So, in summary: if you are moving from one domain to a new domain, or the new site has some changes in content, I would wait until after the holidays (if you benefit from the holidays).