Questions
Fetch as Googlebot for sites you don't own?
Another thing you can do, when you know another site has something that relates to you, is to bookmark that external page using a site that the search engines visit regularly. For example, if that external page is new and is a useful page about an iPhone app, you could submit it to Digg. That should get the attention of other Diggers and the search engines. You would be boosting the power of the other site and of the link to your site, and getting the search engines to find it. I don't do that often, but I have done it a few times, and I think it works.
Moz Pro | | loopyal0 -
Duplicate page content help
Most of the thousands of errors in the Google crawl and the SEOmoz crawl are like this for my site. In my case the issue is that the URL can be built with different parameters and still return what appears to be the same data, no matter what part of the URL changes. I just hope that I don't get penalized for it; it seems almost impossible to stop this from happening. There is really only one page, but the crawlers don't see it that way. To date this is my biggest fear.
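One common way to tame this kind of parameter-driven duplication is to collapse every URL variant to a single canonical address (server-side, or via a rel=canonical tag pointing at that address). A minimal sketch of the normalization idea, using hypothetical example.com URLs:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Strip the query string and fragment so that parameter
    variants all collapse to one canonical URL."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

# Different parameter combinations map to the same page:
canonical_url("http://example.com/page?sessionid=123&sort=asc")
canonical_url("http://example.com/page?ref=digg")
# both return "http://example.com/page"
```

The on-page equivalent is emitting `<link rel="canonical" href="...">` with that normalized URL, which tells crawlers which version of the page to index.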
Moz Pro | | landed0 -
http://lsapi.seomoz.com pop-up
Things should be working now. On Friday OSE and Linkscape (and lsapi) were down, but service came back around 5pm Pacific that day. If re-adding the extension doesn't work, it's time to email the help desk at help@seomoz.org.
Moz Tools | | KeriMorgret0 -
4XX comments/feed/
You should investigate all errors, but it sounds like you have done so and determined the root cause in this case. Since you are aware it is your RSS feed, I would ignore the error.
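If those feed 4XXs clutter the error report, you can filter them out before triage so only unexplained errors remain. A small sketch (the URL pattern and example.com addresses are hypothetical; adjust the rule to match your own feed paths):

```python
def is_known_feed_error(url):
    """Return True for crawl errors we've already diagnosed:
    4XXs on WordPress-style comments/feed/ URLs."""
    return url.rstrip("/").endswith("comments/feed")

crawl_errors = [
    "http://example.com/post-1/comments/feed/",
    "http://example.com/missing-page",
]

# Only unexplained errors are left to investigate:
to_investigate = [u for u in crawl_errors if not is_known_feed_error(u)]
# to_investigate == ["http://example.com/missing-page"]
```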
On-Page / Site Optimization | | RyanKent0 -
Top ranking with no SEO
One more example related to exact-match domains: http://www.lakai.lt has had no SEO for three years (about 1,000 monthly visitors). I added Google Analytics yesterday to catch 404s, and submitted a sitemap. The competitors are far stronger for this kind of site, but still, for the most competitive keywords the site is on the first SERP. I think a stable number of visitors over the years does the job, supporting such a Googlebot-unfriendly website.
Moz Tools | | rokauskas0 -
Schedule crawls for 2 subdomains every 24 hours
Generally, I correct the error and then wait until the next crawl if I'm confident that I've fixed the problem. If I'm unsure, I may run a manual crawl using the crawling tool; however, for me the format just isn't as easy to read as the weekly crawl diagnostics.
Moz Pro | | PeterAlexLeigh0