Hi,
According to this article by Moz on hreflang, yes, having an hreflang tag with the language only will help you cast your net out to English-speaking searchers in other regions.
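As a sketch (using a hypothetical example.com, not your actual URLs), a language-only hreflang tag next to a region-specific one would look like this in the page head:

```html
<!-- Language-only: targets all English speakers regardless of region -->
<link rel="alternate" hreflang="en" href="https://www.example.com/" />
<!-- Language + region: targets English speakers in the UK specifically -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
```

The `en` version acts as the catch-all for English searchers who don't match a more specific regional variant.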
Hi,
This is part of Google's webmaster guidelines as of 2014. Allowing Googlebot to access your CSS and JS files helps them understand how your site works and verify that it works properly. It's part of their effort to provide better results, not only in content, but in usability as well.
Eric said basically what my next recommendation would've been. Building a new site on a new domain is not the way to handle a penalty from Google.
Having 2 websites is confusing for people; i.e., the people who find both of your sites through a series of refined Google searches. Additionally, it's not likely to solve your problem. Unless you moved your physical location AND re-branded your company, Google almost certainly knows that your new site is an offshoot of your old one. Your new site is therefore fighting an uphill battle: competing against its older self, and against Google's preconceived notions of what you're trying to accomplish.
Hi,
After reading your post, I'm wondering why the old domain hasn't already been 301 redirected to the new site? That would really be the ideal situation for not having your two sites compete against each other. You'd want to do this at the page level, so equivalent pages on the old site get redirected to the most relevant page on the new site.
For example: www.oldsite.com/calculator redirects to www.newsite.com/calculator
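A page-level 301 like that can be set up with a one-line rule on the old domain. This is a sketch assuming an Apache server with mod_alias enabled (the oldsite/newsite names are placeholders from the example above):

```apache
# .htaccess on www.oldsite.com
# 301-redirect the old calculator page to its equivalent on the new site
Redirect 301 /calculator http://www.newsite.com/calculator
```

You'd add one rule per page (or use mod_rewrite patterns if your URL structures map cleanly onto each other).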
Hi,
You should definitely switch those tags over to rel prev/next. These tags are designed specifically for this type of content, so there's really no reason to use another tag. By having these in place, you'll be giving Google a better picture of your site's architecture, and therefore a more accurate representation in the SERPs. You may see the subsequent pages in SERPs, but they'll show below the mother page.
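For reference, rel prev/next tags are plain link elements in the head of each paginated page. A sketch, assuming a hypothetical example.com category paginated by a `page` parameter:

```html
<!-- On page 2 of the series -->
<link rel="prev" href="http://www.example.com/category?page=1" />
<link rel="next" href="http://www.example.com/category?page=3" />
```

The first page carries only a rel="next" tag, and the last page only a rel="prev" tag.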
As with most things in SEO, it depends: on the search volume of the keyword, how many sites you're competing with, how many of those sites are well optimized, how many links your page/site has, how authoritative/trustworthy your site is, and so on. There are a lot of variables in the mix, so it's not feasible to provide a numerical answer.
Hi Neil,
If you've got two pages targeting the same keyword, they're going to compete against each other and you likely won't see either of them too high up in SERPs. I recommend canonicalizing one towards the other. Keep the one that is stronger in terms of URL structure, PA, and links (if any).
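Canonicalizing one page towards the other is a single tag in the head of the weaker page. A sketch with placeholder URLs:

```html
<!-- Placed on the weaker page, pointing at the stronger one -->
<link rel="canonical" href="http://www.example.com/stronger-page" />
```

Google will then consolidate ranking signals onto the stronger page rather than splitting them between the two.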
Hi Ramendra,
To my knowledge, you can only provide directives in a robots.txt file for the domain on which it lives. This goes for both http/https and www/non-www versions of a domain. That's why it's important to handle all preferred domain formatting with redirects that point to your canonical version. So if you want the http://www version to be indexed, all other versions should redirect to it.
There might be a workaround of some sort, but honestly, redirecting towards your preferred version as described above is the direction you should take. Then you only have to manage one robots.txt file, and your indexing will better align with what you want.
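Consolidating onto the www version can usually be done with one rewrite rule. A sketch assuming Apache with mod_rewrite, and example.com standing in for your domain:

```apache
# .htaccess — 301 every non-www request to the preferred www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

With that in place, only http://www.example.com serves content, and its robots.txt is the only one that matters.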
THANK YOU!
This has been driving me nuts! I really don't understand why people keep calling it a 'factor'. It's a tool for determining SERPs, NOT a factor!
Factors are things that we, as webmasters/SEOs/whatevers can control. We obviously have no control over RankBrain.
I actually have come across a handful of URLs that are NoIndex; I'll DM you a list once it's complete.
I can't be certain this is the root of the problem (I've never seen this error in the crawl report), but based on the error you said you're getting, I believe it's a great starting point.
Hi,
This sounds like it's more related to the meta robots tag, not the robots.txt file.
Try this:
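(The original snippet doesn't appear to have survived here, but a meta robots tag typically takes this form, placed in the head of the page you want kept out of the index:)

```html
<!-- Keep this page out of the index, but still follow its links -->
<meta name="robots" content="noindex, follow" />
```

Unlike robots.txt, this is set per page rather than per site.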
Commenting so I can see any response from a Mozzer, as I'm still waiting on my 500 pt. shirt.
Hi Rena,
You technically can do that, but it's not recommended - for the exact reason you state above. More often than not, 2 sites aren't going to have the same set of disallow rules.
Additionally, you should be using robots.txt files to direct search engines to your XML sitemap, and if you're sharing a robots file, you can't specify 2 different sitemaps on 2 different domains.
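The sitemap reference is just a line in the robots.txt itself. A minimal sketch, with example.com and the Disallow rule as placeholders:

```
User-agent: *
Disallow: /admin/

Sitemap: http://www.example.com/sitemap.xml
```

Since the Sitemap line points at one specific domain, a shared file can't serve two sites correctly.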
Hi Josh,
I think you nailed it with your 3 bullet points. There is absolutely no way to guarantee any link worth having, so I don't think you're missing anything and I don't believe there is another way to look at it that justifies making promises that can't be kept.
The best way to mitigate this problem is to update the destination URLs in your AdWords campaigns. You can do this in bulk relatively quickly using the AdWords Editor desktop application.
Hi Steven,
You'll definitely want to apply 301 redirects to any site that you move to HTTPS. For most sites, this can typically be done with a single redirect rule that essentially replaces http with https, so you won't have to comb through each URL and apply one-to-one redirects.
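A sketch of that single blanket rule, assuming an Apache server with mod_rewrite (adjust to your stack if you're on Nginx or IIS):

```apache
# .htaccess — 301-redirect all HTTP traffic to the HTTPS equivalent
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

This preserves the host and path, so every old HTTP URL maps one-to-one onto its HTTPS counterpart without individual rules.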
No need to worry about losing link juice; Google views these types of 301s differently than a typical 301, and all authority will pass through them.
Canonical tags should also be updated; this will help search engines learn your new URL structure and ensure they index the new HTTPS URLs.
Cyrus Shepard wrote a great post with all the necessary steps for a secure migration; check it out here: https://moz.com/blog/seo-tips-https-ssl
Good luck!
Hi Kurt,
I think it depends on the likelihood of your readership's interest in reading content on both topics. If you think there's value (and likelihood of interest) in people being able to easily access content on both sides, then go with one blog and categorize. If you think there is little-to-no chance of a research blog reader being interested in the fundraising content, and vice versa, then go with two separate blogs.
A side note, since you're concerned with SEO - while looking at your site, I noticed a lot of your meta descriptions are duplicates, and your homepage is duplicated across two different URLs. Run this search for details: https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=site%3Acentiment.co
Hi,
Google will count this as duplicate content, regardless of affiliate status. You've got a few options here:
I think option 3 is the most viable, since you won't have to ask the affiliates to make any operational changes.