Hi there.
It looks like you need to read this: https://moz.com/beginners-guide-to-link-building
And a couple more articles from here: http://bfy.tw/1cUh
"B" option is way to go.
Just optimize and use those terms in natural way, don't keyword stuff and you will be fine.
Hi there.
Well, let's go step by step:
From an SEO perspective, it doesn't matter which version you link to, as long as you have a proper 301 redirect from one version to the other. For example, in .htaccess (Apache):
RewriteEngine On
# Redirect non-www to www with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^your-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.your-domain.com/$1 [R=301,L]
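And if you'd rather keep the non-www version as canonical, here is the same sketch in the opposite direction (same assumption: Apache with mod_rewrite enabled):
RewriteEngine On
# Redirect www to non-www with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^www\.your-domain\.com$ [NC]
RewriteRule ^(.*)$ http://your-domain.com/$1 [R=301,L]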
This will ensure that you don't have duplicate content and will help crawler bots understand which version of your website is "canonical". Just remember that "www" is technically a subdomain. At the same time, you can set the preferred version of your website in Google Webmaster Tools. This will "tell" Google how to display your website in search results.
https://support.google.com/webmasters/answer/44231?hl=en
If you have added a new version (http://, www, non-www, https://) of your website to GWT, it will require some time to gather all the information.
https://support.google.com/webmasters/answer/2571221?hl=en
Make sure that you have a proper 301 redirect from www to non-www (or the opposite). If you only just set it up, it will take time for the Moz tools to recrawl and pick that up. If the redirect is proper, you should see a message like this in Open Site Explorer (when you enter the version that is supposed to be redirected):
You entered the URL http://your-domain.com which redirects to http://www.your-domain.com/.
Because it's likely to have more accurate metrics, we're showing data for the redirected URL instead.
Click here to analyze http://your-domain.com instead?
So, first, make sure you have the redirect, then wait until it's recrawled. (You can request a recrawl in GWT.)
Hi there!
First of all, Google's "link:" search operator is not accurate. At all, actually. Read this post by Rand: https://moz.com/blog/google-link-command-busting-the-myths
Now, about OSE. The way it works explains both why your links can still be shown and why new (or older existing) links haven't been crawled yet. Moz has its own index, and it does not contain all the links on the internet. It's large, but still limited. So, in the case of local listings, for example, Moz doesn't crawl past the first couple hundred results. Here is a topic on this matter: https://moz.com/community/q/local-listings-aren-t-being-crawled
Also, the OSE index is updated about once a month, so if you deleted/disavowed backlinks after the last update (the date can be found here), it will not show the changes for sure.
And lastly, even if you disavow backlinks through Google, they are not removed from existence. They are still on those forums/sites/etc.; they are just no longer counted by Google as link-juice-passing entities.
So, to sum up: the Google "link:" search operator is inaccurate; the Mozscape index is updated every month, not every day, and it crawls only "surface" links; and disavowing doesn't equal deletion.
Hope this helps!
Hello, my friend.
I have been asking myself the same question you're asking here.
Well, I have done very scientific statistical research on the corner of a newspaper while I had my morning coffee, and the results I got weren't surprising at all: the conversion rates on traditional advertising are way lower, and the cost is much higher.
Now, as to the question you asked. The only(!) way to do this properly is to run only one(!) type of advertising at a time, for a somewhat significant period of time (I'd say at least three months, plus another three months to track "snowball" results).
So, this is the way I'd do it (it requires lots of time, money, and several clients who are ready to be lab hamsters).
While running all those experiments, it'd make sense to use promo URLs for non-internet marketing channels, e.g. domain.com/nameofradio. This way you can promote your website with a dedicated landing page and track visits to it; see the sketch right below.
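For instance, a promo URL can be a simple server-side redirect onto a tagged landing page. A minimal .htaccess sketch (assuming Apache with mod_rewrite; the landing page path and campaign names are hypothetical):
RewriteEngine On
# Vanity URL for a radio campaign; 302 keeps the short URL itself out of the index,
# and the UTM parameters let you segment these visits in your analytics
RewriteRule ^nameofradio$ /radio-landing-page/?utm_source=radio&utm_medium=offline&utm_campaign=radio-spot [R=302,L]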
P.S. Of course, three months is a short period of time to see good results. Extend it to 6-12 months if possible.
Also, if you do it on 4 different clients, there is going to be a larger margin of error. So, in a perfect world, you'd do it all on the same client.
Hope this helps, and good luck!
Alright, I see what you're talking about. I tried doing a similar search in Russian Google, and it gave me the same suggestion.
https://www.google.nl/intl/en-NL/policies/technologies/cookies/
As far as I understand, Google uses your browser locale and settings to offer a different language. So, if you say all your settings are in Dutch, then I'd look into your normal usage: do you mostly search in English? Do you mostly browse English websites, etc.?
Additionally, I've noticed in the past that even if you tell the Chrome browser not to offer to translate a page, it still does it. There is even a meme about that (can't find it now).
Hello, my friend.
I have noticed the same thing happening to our website and our clients' websites. As you said, we see lots of bad/spammy links pointing to our competitors, and they rank high (though not always higher). Well, I asked this question here: https://moz.com/community/q/spammy-backlinks-are-working
After reading all that, plus just using common sense, plus a bit of hope in the intelligence of Google updates, I just didn't have enough guts to risk the rankings we've achieved so far and the reputation of the domain.
So, as was said in the responses in that discussion, if you're willing to see your website get messed up in case everything goes south, you're more than welcome. Just write a case study/research afterwards for curious minds like me!
Yes, I'm in the US. That's why at the moment I'm like, "What are you guys talking about?!"
Hello, my friend.
It looks like those pages were not blank at the time; these appear to be session-based shopping cart pages. Meaning that while you are on the website and add something to your shopping cart, you get that URL with an identifier of what exactly you have in your cart. After you leave the page, that ID page is either saved (in which case it shouldn't be blank now) or deleted if you make a purchase.
So, check how your functionality works; most likely there is some type of bug. If there are no bugs and all those pages are supposed to exist and be blank (weird...), just canonicalize them all to domain.com/cart.
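The usual way is a rel="canonical" link tag in each page's head, but if those pages are generated and you can't touch their markup, Google also honors rel="canonical" sent as an HTTP header. A rough .htaccess sketch (assuming Apache with mod_setenvif and mod_headers enabled, and a hypothetical URL pattern of /cart/<session-id>):
# Flag session cart pages by their (hypothetical) URL pattern
SetEnvIf Request_URI "^/cart/." SESSION_CART
# Send rel="canonical" as an HTTP Link header on those pages only
Header set Link "<http://domain.com/cart>; rel=\"canonical\"" env=SESSION_CART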
Hi there.
According to this article about Moz Local (https://moz.com/blog/announcing-local-reporting):
Listing Reach is our indirect representation of how far the data aggregators have spread your listings across the local ecosystem, based on the number of results returned for exact-match searches of your NAP.
So, as far as I understand, it's literally just an estimate based on the number of searches. So, if there was a large decrease in overall searches, or a decrease in searches for your industry, and your name didn't show up in the results, you'd see a change in "Listing Reach".
So, it's not necessarily a problem with Moz Local. I'd say just wait a couple of weeks and see if there are more jumps.
As I said on your other post, it's most likely Magento's weirdness. Make sure you understand how that CMS works, and so on.
Also, see where those pages are being linked from, because that is usually how crawlers find such pages. When you find where a page is linked from, either delete or fix the link.
Cheers!
Hi there.
Quote from GWT:
Note: When looking at the links to your site in Search Console, you may want to verify both the www and the non-www version of your domain in your Search Console account. To Google, these are entirely different sites. Take a look at the data for both sites. More information
I'd do it for both versions, just to be safe, even if you have proper redirects from one to the other. If you don't have proper redirects, gotta set them up right meow!
Hope this helps.
Theory states that duplicated content reduces certain keywords’ position in Google.
Wrong. Google might omit duplicate results or ban sites that abuse the practice, but it doesn't lower rankings based on the number of duplicates or anything like that. Otherwise, Wikipedia or any aggregating websites (car dealers, etc.) would be nowhere to be found.
It also says that a site that copies content will be penalized.
Semi-wrong. It will be penalized only if it's spammy and overdoing it.
Watch this video by Matt Cutts on duplicate content: https://www.youtube.com/watch?v=mQZY7EmjbMA
So, my understanding is that there is no 100%-working way of taking down scrapers, because some of them are actually "good" scrapers. Like Facebook - the biggest scraper in the world!
So, to beat them in the rankings, just make sure that you are an authority in your industry, have an awesome backlink profile, and have all aspects of SEO properly implemented. And yes, sometimes those penalization tools can help.
Hello, Olga.
If you want to simply remove it (or at least tell Google bots that it's removed), use a "410 Gone" status, not a 404. For user experience, I'd redirect those subdomains to the index page of the main domain, or, if you have time and resources, create a landing page saying something like "Sorry for the inconvenience, the content is not here anymore; use search to find whatever you need", add a search bar for the main domain's content, and voila!
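If you go the "410 Gone" route and the retired content lives on its own subdomain, here is a minimal .htaccess sketch (assuming Apache with mod_rewrite; the subdomain name is hypothetical):
RewriteEngine On
# Match requests for the retired subdomain
RewriteCond %{HTTP_HOST} ^old\.your-domain\.com$ [NC]
# The G flag answers every URL with "410 Gone"; "-" means "don't rewrite"
RewriteRule ^ - [G]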
Hope this helps.
You can use the full keyphrase in your page title/content/H1 tags and so on without keyword stuffing.
There are lots of good posts and Whiteboard Fridays about that. Here is one of them: https://moz.com/blog/keyword-targeting-density-and-cannibalization-whiteboard-friday
Hello, fellow Mozzers.
I just read this article by a guy (not myself, I promise!) whom I had never heard of before:
http://stream-seo.com/churn-burn-backlinks-case-study/
According to him, SUPER spam is actually helping, and, according to him again, Google Penguin didn't knock it down either.
It all looks and sounds very shady and salesy with all those tools he mentions (actually, not just mentions, but links to). At the same time, it looks impressive if it's true.
My thought is that it's a falsified/buttered-up article; otherwise, if everything he says were true, we would see a direct correlation between ranking position and the number of backlinks (linking domains).
I've been struggling to resist the temptation to try some spammy techniques to get backlinks and see what happens, but, at the same time, my brain says not to be an idiot.
Howdy, my friend.
This does sound a little complicated.
We have a .com website which has a little domain authority and is growing steadily. We are a UK business (but have a US office which we will be adapting too soon)
We are ranking better within google.com than we do on google.co.uk probably down to our TLD.
The first line sounds like you have one domain (.com), while the second line tells me that you've got two domains (.co.uk and .com). So, which is it? Well, I guess it wouldn't affect my answer; I'll cover both.
So, let's go step-by-step:
Is it a wise idea to 301 our .com to .co.uk for en-gb enquiries only? - No! Because I can be in any part of the world and set my browser locale header to whatever I want. Or it can happen automatically due to whatever circumstances. So, let's say I'm travelling from the UK to the US. My browser is set to the en-gb locale, I do my search for your company in the US, and I get redirected to the UK website, even though I want to find the US one - no good. Bad UX.
Is there any evidence that this will help improve our position? Will all the link juice passed from 301s go to our .co.uk only if we are still applying the use of .com in the US? - I'll combine these two into one answer. Link juice can be passed if crawlers can differentiate the locale of a given URL. Here is how they can do it:
https://support.google.com/webmasters/answer/6144055?hl=en
So, let's say there is a website with a link to yours. It's one link. You can't set two "hreflang"s, or two "rel"s, or write it in two languages at the same time on the same URL. So, basically, you can tell bots to consider that link a "juice-passer" to only one domain.
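For reference, hreflang annotations usually go into the HTML head or an XML sitemap, but they can also be sent as HTTP Link headers (Google documents this mainly for non-HTML files such as PDFs). As an illustration only, a rough .htaccess sketch for the homepage alone - every page needs its own pair, and it assumes Apache 2.4 with mod_setenvif and mod_headers plus hypothetical domain names:
# Flag the homepage
SetEnvIf Request_URI "^/$" HOMEPAGE
# Point bots to the en-GB and en-US alternates of this page
Header add Link "<http://www.your-domain.co.uk/>; rel=\"alternate\"; hreflang=\"en-gb\"" env=HOMEPAGE
Header add Link "<http://www.your-domain.com/>; rel=\"alternate\"; hreflang=\"en-us\"" env=HOMEPAGE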
Now, is there a good reason for you guys to have two different domains? If they have similar information, style, etc., you can combine them into one (use subdomains or subfolders - see the Matt Cutts video) and pretty much cut your efforts in half with double the return.
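If you did consolidate, the move could look something like this - a sketch only, assuming Apache with mod_rewrite and a hypothetical /uk/ subfolder:
RewriteEngine On
# Fold the .co.uk site into a /uk/ subfolder on the .com domain, preserving paths
RewriteCond %{HTTP_HOST} ^(www\.)?your-domain\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.your-domain.com/uk/$1 [R=301,L]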
If there is a good reason for you guys to have completely different domains, I would concentrate on building/working on/structuring the .co.uk website for the UK and the .com website for the US. It means that you'll have to build two backlink profiles and do two different technical and "another one" (forgot the word) SEOs, and so on.
Hope this helps.
Anybody have recommendations on a mini projector for business presentations? One that could be connected to a laptop or tablet.
Thanks for any feedback!
5mil+ links?! Wow!
What's their spam score? I'm surprised they are not blocked or something.
To answer your question: what does common sense tell you? The job of Google and its bots is pretty much based on common sense. So: a duplicate-content website, a ridiculous number of links, no referral traffic - all these are obvious signals to run, Forrest, run!
Hi.
It's not happening to me. Are you the only one it happens to? Have you tried doing it from different machines?