Site Explorer does not crawl the whole internet; could it be that the 2,000 are low-quality links from obscure sites? I think SE tries to index the best of the web.
Posts made by AlanMosley
-
RE: No back links showing in site explorer but..
-
RE: Is Syndicated (Duplicate) Content considered Fresh Content?
Well that could mean that some don't need any.
Like
Q: Who discovered Australia? A: Captain Cook.
This does not need freshness. Also consider original content: in that case an older timestamp would be better.
I like to imagine that I own Google and ask myself, would I rank it? Of course some things may rank that were not intended to, but I think it's quite safe to think that way.
-
RE: Is Syndicated (Duplicate) Content considered Fresh Content?
Had a quick look at that page and did not see that it affects all pages. Anyhow, Google said 35% of queries, so it could not be all pages.
Some points:
- Why would fresh data be excluded from duplicate content?
- Is it likely that syndicated data is fresh?
- What is Google trying to do here, rank syndicated duplicate data?
I can't see it working.
-
RE: Is Syndicated (Duplicate) Content considered Fresh Content?
Yes, the freshness update was not for all queries; it was for certain queries that need fresh content, such as football scores or who's on the team this week. Obviously we don't want the score from last year, or who was playing last year; we want the current data. That is where the freshness update may give you a boost while your content is fresh. I can't see syndicated content falling into this category, and even if it did, being duplicate content would mean that only one source is going to rank.
Also you have to look at indexing: will the duplicate content even be indexed? And if so, how often?
That's why I say the short answer is no.
-
RE: Fetch data for users with ajax but show it without ajax for Google
One second is a lot for a few links, or even for a lot of links; maybe your server technology has problems.
But you still have the load-time problem no matter who you are downloading the links for, the search engine or the user; they still have to be downloaded.
-
RE: Is my website is having enough content on it to rank?
Yes, I would. I would overlay those images with text, and mark them up using good HTML5 semantics and schema.org.
Don't make it hard for the search engine to work out what your page is trying to do. Describe your data well so that the search engine knows just what you're trying to say.
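As a rough illustration of the schema.org idea: structured data is commonly added to a page as JSON-LD. The type, name and URL below are made-up placeholders, not taken from the question; this is just a minimal sketch built in Python to show the shape of the markup.

```python
import json

# Hypothetical example: JSON-LD structured data using the schema.org
# "LocalBusiness" type. All values here are placeholders.
markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Business",
    "url": "https://www.example.com/",
    "description": "A short description of what the page is about.",
}

# This string would go inside a <script type="application/ld+json"> tag
# in the page's HTML so the search engine can read it.
json_ld = json.dumps(markup, indent=2)
print(json_ld)
```

The point is simply to state in machine-readable form what the page is about, rather than leaving the search engine to guess from images alone.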
-
RE: Keeping SEO benefit of an old URL by changing content
There is a risk, but if the content is relevant it's a small one. You could also rank better.
-
RE: Why has google stopped showing one domain and switching to showing another that points to the same website?
I agree with Patrick and Dirk; you are making trouble for yourself.
Decide which domain you want and 301-redirect to it. Use the logic "if the host is not the preferred domain, redirect to the preferred domain", rather than trying to detect every domain.
-
RE: Fetch data for users with ajax but show it without ajax for Google
If you are going to load the data for Google on page load, then you will still have the load times, so loading the links again using AJAX is not solving anything.
-
RE: Robot.txt file issue on wordpress site.
What the problem is with your site I can't say, because we don't have the URL.
But having said that, why block pages? I find it is very rare that you need to block pages. I remember reading that Google sees blocked resources as a spam signal; by itself that is not a problem, but mixed with other signals it can be harmful.
So do you really need to block the pages?
If you really do, then use meta noindex, follow instead.
You want the links to be followed so that link juice flows back out of those pages to indexed pages; otherwise you will be pouring link juice away via any links pointing to the noindex pages.
-
RE: How many serp results for a domain.
Google likes to share the results between domains, but that is only one consideration. All other things being equal, you would expect to get a mix of domains, but if the pages win out in other ways then there is nothing to stop them ranking.
-
RE: What is the SEO Effect of Removing Old 301 Redirects from Domain A.com -> B.com on B.com?
I don't think so.
A long time ago, in the first days of the internet, websites were no more widely used than other protocols such as FTP or Telnet. A domain might have subdomains such as ftp.example.com, telnet.example.com or www.example.com, but example.com itself would not be used. Since then the web has become the main protocol, and people started to make example.com respond with the website too. So really we don't need www any more, but it's best to redirect it just in case.
-
RE: What is the SEO Effect of Removing Old 301 Redirects from Domain A.com -> B.com on B.com?
It's a matter of choice.
I prefer example.com: if your website answers to both www.example.com and example.com, then why not use the simpler example.com?
-
RE: What is the SEO Effect of Removing Old 301 Redirects from Domain A.com -> B.com on B.com?
It will take a few weeks to sort out; not a problem.
But if you have any links to the non-www version, they will now be redirected, and each redirect leaks a little link juice.
-
RE: Why Google Cache is not showing ?
Just in case you can't see the page, this is the text:
SORRY! If you are the owner of this website, please contact your hosting provider: webmaster@wwww.tcclinic.com
It is possible you have reached this page because:
The IP address has changed. The IP address for this domain may have changed recently. Check your DNS settings to verify that the domain is set up correctly. It may take 8-24 hours for DNS changes to propagate.
There has been a server misconfiguration. Please verify that your hosting provider has the correct IP address configured for your Apache settings and DNS records. A restart of Apache may be required for new settings to take affect.
The site may have been moved to a different server. The URL for this domain may have changed or the hosting provider may have moved the account to a different server.
-
RE: Why Google Cache is not showing ?
I just went to your site; it was down.
-
RE: Risk Using "Nofollow" tag
If you use nofollow, then every link pointing to those pages will throw away its link juice; you don't want that.
Follow means that link juice will flow through the links back to your indexed pages. Telling Google not to index is doing them a favour, as they don't want duplicates; I don't think there is any concern.