Posts made by IPROdigital
-
RE: Google site: operator showing only 30 results for whatever website you may like, omitting the rest
It is a temporary glitch caused by Google's domain clustering. It should be back to normal by next week. Matt Cutts confirmed this. Irritating as hell, though.
-
RE: How to find and fix 404 and broken links?
I think you need to post the URL! Have you tried the links on every page? Personally, I would use an include file to reduce errors and maintenance time, but I'm not sure if you are...
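As a rough sketch (assuming Apache with Server Side Includes enabled; the /includes/nav.html path is hypothetical), a shared include means a broken navigation link only has to be fixed in one file:

<!-- In each page's template, where the navigation should appear -->
<!--#include virtual="/includes/nav.html" -->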
-
RE: Is this caused by EMD update or other reason?
Yes - one position change may not be enough to draw conclusions from. But "To clarify US site is more popular, has about 6 times more external links than UK, Domain, page authority and all other SEO things are higher" probably is relevant. Also, is there duplicate content (just a thought I had)?
-
RE: Is something strange going on with Google last few days?? Seeing strange results for competitive keywords
They probably won't last. People will go from the Google search results pages to these sites and, if they offer nothing of use, return to the results page. Google's data will then show a high bounce rate, which will affect their rankings. It's a genuine indicator of usefulness, IMO.
-
RE: Some inbound links which i have removed long ago, still showing in GA and in Open Site Explore, how do i remove them from their?
<META NAME="robots" CONTENT="noindex,nofollow"> will stop Google from indexing the page and from following the links on it, although Google can still reach the page through links from other pages.
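As a minimal sketch (the title is just a placeholder), the tag sits in the <head> of the page you want kept out of the index:

<head>
  <title>Example page</title>
  <!-- Ask compliant crawlers not to index this page or follow its links -->
  <meta name="robots" content="noindex,nofollow">
</head>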
-
RE: Pages crawled is only 23 even after 8 days??
It can take far longer to crawl, especially for new sites which don't have much domain authority and aren't already updated regularly. If you need to type something to access a page, then how would the bot access it? From Google's perspective it's the same thing. I would use a sitemap.xml for better crawling. See the Google Webmasters site for more at http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156184
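For reference, a minimal sitemap.xml looks something like this (the URLs and dates are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-10-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/about</loc>
    <lastmod>2012-09-15</lastmod>
  </url>
</urlset>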
-
RE: How to avoid duplicate content penalty when our content is posted on other sites too ?
If you have access rights to add rel=canonical to the other sites, then use them, but I'm guessing you don't. I would post the articles on your own site first, wait, do a search (e.g. site:example.com/post-url), and only post to the other sites once I know Google has indexed the article on my own site. Otherwise it might assume you've taken the content from the other site, which may incur a duplicate content penalty.
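For reference (the URL is a placeholder), the cross-domain canonical the other sites would need is a single link element in the <head> of their copy, pointing back at your original:

<head>
  <!-- On the syndicated copy: tell search engines where the original article lives -->
  <link rel="canonical" href="http://www.example.com/original-article" />
</head>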
-
RE: Adding Meta Languange tag to xhtml site - coding help needed
At least with Google, I doubt it makes a difference unless there are multiple languages on a page. If you use Chrome you'll see it auto-detects the language and offers to translate. It may only rank the page in a specific country or locale though. If you're aiming at Spanish speakers in the UK, it may be a little different.
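For what it's worth, on an XHTML page the language is usually declared with the lang/xml:lang attributes on the html element rather than a meta tag; a minimal sketch, assuming a Spanish-language page:

<html xmlns="http://www.w3.org/1999/xhtml" lang="es" xml:lang="es">
<head>
  <!-- Older, optional equivalent that some engines still read -->
  <meta http-equiv="Content-Language" content="es" />
  <title>Página de ejemplo</title>
</head>
<body>...</body>
</html>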
-
RE: Https enabled site with seo issues
That depends on what you define as a problem.
If you use relative URLs in the hyperlinks on your website and Google enters at an https page, it will resolve the other pages it crawls to https as well, so the https versions get indexed instead of the http ones. This can cause problems in the long run. I would make the links on https pages absolute http links, e.g. on the sign-in page.
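To illustrate with placeholder URLs: on a secure page such as https://www.example.com/signin, a relative link inherits the https scheme, while an absolute link keeps the crawler on http:

<!-- Relative: resolves to https://www.example.com/pricing when crawled from an https page -->
<a href="/pricing">Pricing</a>

<!-- Absolute: stays on the http version regardless of how the crawler arrived -->
<a href="http://www.example.com/pricing">Pricing</a>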
-
RE: Permalinks best structure
You might get a slightly faster page load if you make custom .htaccess redirects, which should be possible for you. I'd go for page publishing. I wouldn't have one link after another to get to a page; minimize clicks to the destination for the best UX. If you're writing on a niche topic, then yes, the URL alone and the presence of relevant keywords may do it for you, but otherwise they are only part of the strategy.
-
RE: Does Google index has expiration?
I believe they will still be indexed, but links pointing to them will probably be useful if you want them to hold a decent PageRank.
-
RE: Having pages in the footer
I'll agree with Kade and add something. Think about user experience and bounce rate. Let's say you achieve the ranking for the desired keyphrase and you have a title and description compelling enough for the user to click through; if the user then goes back to the Google search results page too quickly, that indicates to Google that the page may not be relevant or is of poor quality. So while you target those keywords, be genuine about it. If you're referring to doorway pages, avoid them. One of Brazil's largest travel sites got its rankings hammered for doorway pages not long back.
-
RE: Steady Traffic Decline over a 1 year period
Yep - Egol has given a great response. You haven't mentioned competitor analysis or your URL, which are the first things that come to mind. Feel free to add them.
-
RE: Geo-specific SERP Rank Tracker that is good for hyper local results?
At this stage, I would go for adding local area names to your search phrases (without duplicating content, of course).
-
RE: Geo-specific SERP Rank Tracker that is good for hyper local results?
This is a tricky one. It involves IP address location accuracy, which is far from perfect (although it has improved in recent years). If you do find a tool, I'd recommend questioning exactly how it manages to determine precise locations, and I'd be very skeptical. Say my ISP's data centre is miles from my house; that's where my location would be pinpointed. It's unlikely my ISP would pass data to a third party (privacy laws) about how far I live from the data centre; otherwise hackers and criminals could also make use of that information. I wouldn't pay much, if anything, for a service which claims to do this. For mobile internet with GPS tracking it's a different matter, but I don't know what your purpose is. Mobile users are said to be of far less value than desktop users (as you know, many advertisers are not prepared to pay as much for keywords on mobile PPC search as they would for PPC on desktop search - relate this to the latest share price drop at Google).

-
RE: Would you still consider the statement in 2008 about link attribution a good strategy? http://www.seomoz.org/blog/headsmacking-tip-7-enforce-link-attribution-for-your-work
Yep - I think it's good that you have links back to your website. They say aggressive exact-match anchor text is not recommended, but when I was designing and developing, there was a limit to how many sites I put out, so I don't think that would have been going overboard.
From what I know, it sounds like an honest approach. Good luck.

-
RE: How to Remove Joomla Canonical and Duplicate Page Content
View page source only shows the HTML. If I can't see the PHP file, i.e. the one which generates the HTML, it's impossible for me to know how. I'm not very clear on this. You can PM the file to me if you want - but please don't send me any passwords.

-
RE: Choosing an SEO Company
Wow. What a nice guy to provide all that useful info!
-
RE: Better click through rate in 2nd position?
Perhaps you have a better description or a more compelling title for those searches? It's not impossible. That's why I tell clients that even if they don't rank above competitors, being on the visible part of Page 1 with a more compelling title and description may be enough to get the visitor to click through.