Keep in mind that a 301 transfers most PR, but not necessarily all. Also, notifying Google doesn't mean the site gets crawled right away. It's the crawler, followed by the indexation process, that determines where a site will rank.
Best posts made by Highland
-
RE: My ranking got down from 4 to 9 in just 2 days
-
RE: Optimum level of link building per month for a new domain
The correct answer is... as many as you can afford the time to build correctly
From the Moz link building guide
There are lots and lots of ways to get links. The right tactics for you depend on the resources you have at your disposal as well as the industry that you're in. Industries that are more established and competitive often require you to be quite aggressive with link building, and you might find earning those links more difficult. Other industries, often the newer industries that are quickly growing, are full of opportunities to engage with bloggers and build a community.
But the key here is to BUILD VALUE. From Rand's recent Whiteboard Friday
I would urge you to go the opposite direction. Narrow your funnel. Worry less about the number of people you're targeting and more about the success rate, because once you get the success rate high, you can turn up the volume really fast. But if your success rate is low and there's a limited market of influencers in your field, you can quickly burn all of them with your outreach before you ever have a chance to get good at it.
The simple fact is that your question is, more or less, "How many low quality links can I safely build?" and the answer there is none. I could easily go get 80 links... and then those links would get devalued or draw a Penguin penalty. Instead, you need to build quality links, and that means you have to take time to build some value with what you're offering. To sum up that Whiteboard Friday: "There are no shortcuts to building quality links." You have to do it low and slow. If someone is willing to give you a link for little or nothing, then that link isn't worth building.
-
RE: Should I let my Apache server compress automatically site information?
Compression (gzip specifically) only affects bandwidth consumption. The user agent has to support decompressing the data for it to work, so there's no effect on any user agent that doesn't support it. The only downside is that compression does require more work on the part of your web server, so if your server is already struggling it might make your site a bit slower.
Website: My apologies for scrutinizing your style, Googlebot, but I noticed your Accept-Encoding headers say:
Accept-Encoding: gzip,deflate
Can you explain these headers to me?
Googlebot: Sure. All major search engines and web browsers support gzip compression for content to save bandwidth. Other entries that you might see here include "x-gzip" (the same as "gzip"), "deflate" (which we also support), and "identity" (none).
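If you decide to turn it on, enabling gzip on Apache usually comes down to a couple of lines with mod_deflate. A minimal sketch (assumes Apache 2.x with mod_deflate available; the MIME type list is just a common starting point):

```apache
# Compress text-based responses only; images and video are already compressed
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css
    AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>
```

Browsers that don't send `Accept-Encoding: gzip` simply get the uncompressed response, so there's nothing to break.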
-
RE: Is there a limit for 301 redirection in htaccess file?
The simple answer is no. Let's say you have page A and 301 it to page B. As soon as Googlebot gets the 301 (might take a bit longer but for argument's sake we'll say it's instant) it drops page A and indexes page B. As the ranking process moves along (which is slower than indexation) most, if not all, of the PR that A had is now moved to B.
Google can't read your .htaccess file (Apache is configured by default to deny access to .ht* files for security reasons) and I've not heard of any long-term penalties from changing URLs. Just realize that there is a gap (days or weeks) between the time that Google notes A moved and the time B is ranked like A was.
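For what it's worth, a pile of 301s in .htaccess just looks like this; there's no hard limit on how many lines you can have, though a very long file does add a tiny bit of processing overhead per request (the paths and domain here are hypothetical examples):

```apache
# One permanent redirect per moved page
Redirect 301 /old-page.html http://www.example.com/new-page.html
Redirect 301 /old-section/ http://www.example.com/new-section/
```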
-
RE: Making a whole site website SSL (https)
https is treated much like a subdomain: the http and https versions of your site are seen as different sites. 301s from the http version will be necessary.
SSL also adds some overhead and requires a bit more bandwidth (there's more negotiation involved). If your site is slow, this could make it slower.
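The whole-site redirect is usually one mod_rewrite rule. A sketch, assuming Apache with mod_rewrite enabled:

```apache
# 301 every http request to the same path on https
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```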
-
RE: Trying a completely new design on our .co.uk should I have it on a different IP to my .com
You can but you don't have to. There is a small benefit to having a ccTLD on an IP that has a reverse DNS to that country. The main reason to have a different IP, however, is different demographics. If your .com is aimed at the US market then it needs to be hosted in the US because it lowers latency. Same with your UK market, your Australian market, etc. Lower latency is better. That doesn't mean you HAVE to do it. You can host all of it in the US and still rank well. It just doesn't have those extra signals, so it's a bit more work, but not greatly more so than setting up these different IPs.
-
RE: How best to handle (legitimate) duplicate content?
There aren't many options here. Geotargeting (even locally) tends to produce duplicate content. The only option, really, is to canonical all your products to one place. If you do it right, you might be able to rank all three sites for your keyword.
You can try #1 but, as you said, it's hard to restate the same content in a non-duplicated way.
-
RE: Client has franchisees on separate sub domains
I think you're right. It doesn't sound like there's a solid case here for subdomains. Consolidation would make for easier marketing as well.
-
RE: Penguin/Panda/Domain Purchase
Run a report on Open Site Explorer. See what links they have. Look for things that could be harmful. There's no easy way to undo links, but many webmasters will gladly unlink you if you ask.
That having been said, if the domain name was really that good, I'd find a way to make it work (especially if the domain is easy to remember). A good domain name is hard to come by these days and there are major reasons beyond SEO to buy one.
-
RE: How about this new Q&A Design?
Nice new (and cleaner) interface but still lacking a few things. A major one for me is I still can't search Q&A posts I've made, only blog posts and comments. There is a nifty new mozpoints graph, though.
-
RE: Will Google read my page title and H1?
I assume this is ASP or .NET. When your page is rendered, it reaches the browser (and Googlebot) as normal HTML. Google doesn't hate HTML.
-
RE: Untrusted site - malware!
These pages or their domains should be disavowed to remove them from the Google index.
The disavow tool can't affect whether or not a site is indexed (otherwise I could use it to de-index all my competitors). Disavow is a hint that tells Google "please don't count this link towards PageRank". In other words, disavow, more or less, lets you reach into another person's site and nofollow the link to you.
So, should you use it? Unless you have WMT messages telling you you have a manual penalty, don't touch it. Most likely the links in question are already devalued. If you use it, you could disavow links actually helping you and thus do yourself harm.
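For reference, if you ever do genuinely need it, the disavow file Google accepts is just a plain text list, one entry per line, with `#` for comments and a `domain:` prefix to cover a whole site (the domains below are made-up examples):

```text
# Spammy directory that won't respond to removal requests
domain:spammy-example.com
# A single bad page rather than the whole site
http://example.org/low-quality-links.html
```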
MALWARE!!!!
Unless there's malware on your site, links from sites that have malware on them won't hurt you (simply because they have malware).
-
RE: Malicious site pointed A-Record to my IP, Google Indexed
That sounds like a bad web server config. Most servers use virtual hosts, meaning the hostname in the request determines which website is served up. Either you have your own virtual dedicated server with only one site that isn't using vhosts, or your host has set your website up as the default site.
If you have control over the web server config, I would add the malicious site to the config as a hosted site and then have it return a 404. That should de-index it.
If you don't have that level of control, try to get a 301 redirect for the bad domain. You really need something like an htaccess rule that says: if anything is accessing my website as anything but www.mydomain.com, 301 it to that URL. Otherwise anyone in the world can hijack your site the way it's set up now; all it takes is another A record pointed at your IP and you have instant duplicate content headaches.
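That catch-all rule is a standard mod_rewrite pattern. A sketch, assuming Apache with mod_rewrite and with www.mydomain.com standing in for your real canonical hostname:

```apache
# 301 any request whose Host header isn't the canonical hostname
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.mydomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]
```

With this in place, a rogue A record pointed at your IP just bounces visitors (and Googlebot) straight back to your own domain.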
-
RE: On the use of Disavow tool / Have I done it correctly, or what's wrong with my perception?
Wait, you just saw a bunch of links and disavowed them because... you saw a bunch of links? Did you have any penalties or rank drops? Anything that would lead you to believe these links are actively harming your rankings?
Disavow is a cleanup tool, not a preventative tool. I mean, there's a really good reason why they tell you NOT to use this tool lightly. If I could put this on a giant neon sign I would, but here's what Google says on their help page (emphasis mine):
This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool.
If I were you, I'd yank the disavow out right now.
-
RE: Blog tags are creating excessive duplicate content...should we use rel canonicals or 301 redirects?
Canonical, hands down. This is what canonical was made for anyway: duplicate content you can't remove.
Canonical simply lets you tell Google which version of the duplicate content should "win" the indexation race, and Google will take it into consideration. I can think of many reasons why you'd have overlapping tags but would not want to remove them (which is what a 301 would do).
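In practice it's one tag in the `<head>` of each duplicate tag page, pointing at the version you want indexed (the URL here is a hypothetical example):

```html
<!-- On each overlapping tag/archive page -->
<link rel="canonical" href="http://www.example.com/blog/original-post/" />
```

Unlike a 301, visitors can still browse the tag pages normally; only search engines are steered to the canonical version.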
-
RE: GTLD for SEO?
The only TLDs that get any known different treatment are ccTLDs (which send a geolocation signal).
The reason non-"standard" TLDs don't show up very often is that most people buy the common TLDs. Most people still associate the ".com" with websites because they are the oldest TLD. I personally own a ".info" for my email (not a new TLD either) and I typically have to spell it out to people (I occasionally get some strange looks handing it out verbally). I can only imagine the looks if I had a more exotic TLD ("Yes, my email is ralph@crazy.ninja... no that's really it"). So other TLDs are basically less popular and less well understood, which explains why they aren't used very often.
The arguments over the years that I've seen that Google is flat out lying about the TLDs being treated equally have never really withstood scrutiny. There has never been anyone who has made a given TLD rank higher solely because of the TLD. The cases I've seen had other factors that could just as easily explain a ranking difference. And an objective test would be very difficult to create, as you would need two identical sites, and from an SEO perspective that's super hard to pull off.
Ultimately, though, why would Google lie about this? What's to be gained in telling people it's not a ranking factor when it is?
-
RE: Sorry Posting again.... Still not clear on my question.
Generally speaking, no. You're using ccTLDs, and it's generally understood by Google that each one is targeted to a specific region. Furthermore, the countries you're targeting are all English-speaking, so localization is not as critical. You could try to differentiate the language (Canadian English is different from Australian English, etc.) but it's not necessary (although you would probably want to do this on French Canadian pages, if applicable).
If it's really, really eating at you that much, host each site in the country it's targeting (not necessary but it helps a bit). Other than this, watch this video by Matt Cutts and be happy.
-
RE: Linking Building: Do I have to beat Linking Root Domains or Total Links?
Neither.
The aim is to build link popularity and, while the number of linking domains contributes to this, the quality of the sites giving you the links matters more than the total number. That's a major reason why the SEOmoz tool is important: it measures not only the number of links, but the power of those links. At present, however, I wouldn't totally trust the Moz numbers (Penguin has affected this in ways we still don't understand), but they can be a good gauge of where you stand.
Quality beats quantity. That's the gist of Panda and Penguin.
-
RE: Confused About Problems Regarding Adding an SSL
One major tip I always point people to is that using protocol-relative (protocol-less) links for anything external is a great way to make sure your site always supports SSL without issue.
Firebug is a great way to make sure everything is loading HTTPS. Turn it on, switch to the Net tab, and load your page. It will show you every request sent as part of your page. It makes spotting non-SSL requests easy.
You can turn HSTS on yourself if your provider uses Apache and supports htaccess. (sorry I can't link an article, Moz won't let me). If they don't, you will have to have your host enable it on their end.
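The htaccess version is a single header; a sketch assuming Apache with mod_headers enabled (max-age is one year in seconds, and `includeSubDomains` is optional):

```apache
# Tell browsers to always use HTTPS for this host for the next year
<IfModule mod_headers.c>
    Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
</IfModule>
```

One caution: once browsers have seen this header, they will refuse plain http to your site until max-age expires, so only turn it on when you're sure the whole site works over SSL.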
-
RE: How many times have you reauthorized your facebook account for Moz analytics?
I doubt it's a Moz error. Having had to deal with the API on the other end, sometimes FB will lose authorization on its own. My wife plays some games that are in app form and those apps will ask you, sometimes 2-3 times a week, to authorize FB again.