It's likely because it's attempting to find an exact match, so if you have Moz tracking [Pelican] it won't see [Pelican™] as the same thing.
Posts made by MikeRoberts
-
RE: My title has a TM symbol and Moz says I don't have the keyword in my title
-
RE: Webmaster tools Hentry showing pages that don't exist
Without more information or a site to look through, I did a cursory search for Hentry issues that could be causing your problem, along with their potential fixes.
https://www.acceleratormarketing.com/trench-report/google-analytics-errors-and-structured-data/
-
RE: Moz Point Swag
I figured as much, which is fine. I can wait until the day I get the box. It would just be great to walk in, put down that Roger figurine, and make my boss jealous.
-
Moz Point Swag
So, I've been away from the SEO world for a few years, but now I'm back in full swing and I noticed I have enough points for "A special MozPoints t-shirt and a Roger vinyl figurine." I never received those, so I assume they were added as swag during my hiatus... but I'd absolutely love to get hold of that figurine, as my boss is a big Roger fanboy and it would be hilarious to have it on my desk as a friendly mocking.
If I can't, I can't. But I figured it couldn't hurt to ask.
-
RE: 301 Redirect Question
I ran a crawl in Screaming Frog as well and don't see a problem with the 301s. They mostly seem to be pointing the non-www pages to the www versions. Assuming you want the www version ranking over the non-www, everything is fine. As long as everything points to the correct version of each page, you shouldn't have any issues.
-
RE: Duplicate content across a number of websites.
The problem, as stated by Logan and Don, is that if the sites for the 25 different locations are too similar, none of them are going to do well in the SERPs. You need to determine how much of each site would be too similar or duplicate content and consolidate that. One way to do it, as stated by Don, is a single site with local options.
Some achieve this by using geolocation or having visitors enter postal codes, either choosing their local store or having site parameters alter product availability. The content is then restricted to the offerings at the visitor's local store instead of showing all available options from the overarching corporation. The product pages still exist and are crawlable, but some color options may be grayed out where they aren't available, or "Out of Stock" warnings will appear where applicable.
One other option I've seen is using differing subdomains to offer the same basic idea as geolocation/postal codes, but with a potential boost to local organic search, e.g. NewYork.Webstore.xyz vs. London.Webstore.xyz. This would allow each location to essentially have its own mini-site on the company's main site (like a halfway point between one big single site and 25 duplicate-content sites). With the single site altered by location data, you only need one version of each product page, but you would need to write some great localized landing pages for each individual store. For the subdomain idea, you'll want to canonicalize all the duplicates to a main version... so the pages NewYork.Webstore.xyz/ProductA/ and London.Webstore.xyz/ProductA/ would have rel="canonical" pointing at your main site's page, Webstore.xyz/ProductA/, so authority is passed to the root domain and you don't get penalized for duplicate content.
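To illustrate that subdomain setup, here's a rough sketch of what the canonical tag would look like in the head of each subdomain's product page (the Webstore.xyz domain follows the example above; treat it as a placeholder for your own URLs):

```html
<!-- On both NewYork.Webstore.xyz/ProductA/ and London.Webstore.xyz/ProductA/ -->
<head>
  <!-- Points search engines at the main-domain version as the one to index -->
  <link rel="canonical" href="https://Webstore.xyz/ProductA/" />
</head>
```

Each subdomain page stays live for visitors, but search engines consolidate ranking signals on the main-domain URL.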
-
RE: Recommendations for the length of h1 tags and how much does it matter. What is the major disadvantage if the h1 tags are slightly longer.
From my understanding, there is technically no limit to the length of an H1 tag. My rule of thumb has always been to keep it short and to the point; you don't want to water down any relevancy gained from the H1 by shoving too much into it.
-
RE: Can I configure Moz to ignore certain query params the same way I can in GWT?
The best way I've found to handle this is setting up your site so that every version of a page with parameters features a rel="canonical" tag pointing at the version of itself without parameters.
Edit: As necessary, depending on the nature of the parameter and how heavily it affects the contents of the page.
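As a sketch of that setup, a page reached with tracking or sorting parameters would self-canonicalize to its clean URL (the domain and parameters here are made up for illustration):

```html
<!-- Served at https://example.com/shoes?sort=price&utm_source=email -->
<head>
  <!-- Canonical points at the parameter-free version of the same page -->
  <link rel="canonical" href="https://example.com/shoes" />
</head>
```

Most CMSs and SEO plugins can emit this automatically, so crawlers fold all parameterized variants into the one clean URL.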
-
RE: Why Can't I Change My Campaign to Track All Subdomains?
Whether or not to exclude subdomains can only be set when creating the campaign; after that, you're locked out of those choices. As far as I know, you'll need to create a new campaign for your root domain and leave "exclude subdomains" unchecked.
-
RE: Tools to check IP address of websites
I generally go to Who.is for things like that.
-
RE: Weird 404 URL Problem - domain name being placed at end of urls
I had this problem in WordPress about a year ago. In my case it was caused by links entered into posts being turned into relative links instead of absolute links, which somehow caused the domain name to be appended to the end of the URL. It turned out to be an incompatibility between plugins. Have you tested all your plugins to see if any of them are interfering and causing this issue?
-
RE: Facebook Reach on Post Just Spiked!
EdgeRank doesn't exist anymore. It's much more complex now, and without a catchy name. (I still catch myself calling it EdgeRank when trying to explain Facebook feeds to people, though.) http://marketingland.com/edgerank-is-dead-facebooks-news-feed-algorithm-now-has-close-to-100k-weight-factors-55908
Check the post metrics in your Facebook Insights for a deeper understanding of that post. It might help you glean more ideas as to what specifically was different about it, and maybe that can be replicated in future posts. Much of Facebook is trying to determine when to post the things that should be seen right away, when to post the interesting stuff that can sit all day, when to tag whom and how, varying images, links, and shares, and figuring out what your community appreciates so you can deliver more of it to them.
-
RE: When does Moz update campaign data with new timeframe ?
It should be every week, on or around the day of the week the campaign was created, if I'm not mistaken. I get my updates every Friday afternoon except for one campaign that updates on Tuesdays.
Edit: I forgot this function was added: when you're in your account, there's an FAQ tab on the right side of the screen; you can click that, then click any of a number of items to get those answers directly. Trying it out now, when I click on the week listing it pops up: "Q: How often will my site be crawled? A: We crawl your site once a week, usually on the same day each week. For example, if you set up a campaign on Monday, the first crawl will be done on Tuesday and all of your subsequent crawls should complete on Tuesday going forward."
-
RE: Is it worth pursing PR and guest posting just for links?
Pursuing any avenue just for links is the wrong way to go about it. Press release links have been hit hard lately because of their misuse and overuse... but if you have news on your business that is actually PR-worthy, shopping it around to respectable sites and getting your news out to the right people can increase your qualified traffic. The same goes for guest posting. It's been hit hard lately, but it's not about the links per se. Getting your name out there, branding, and sharing useful, humorous, or poignant information can help people learn who you are, increase your qualified traffic, etc., and you don't need a followed link to reap the benefits.
It also wouldn't hurt to look into social for branding and community purposes. A product/tool/widget/infographic can also be a great way to gain links and/or spread via word of mouth and social mentions. But be sure not to embed any hidden links in shareable widgets or you'll get slapped by Google as well. As Andy put it, creativity is the key here. There are so many ways of earning links, getting shared, and being seen online that there is practically no limit to what is possible.
-
RE: Number of indexed pages dropped dramatically
It is unlikely that having links from MyBlogGuest would cause a drop in indexed pages like that. Where are you seeing this drop in indexed pages? Is it being reported in Moz or Google Webmaster Tools? Also, do you have Google Analytics set up for your site to check other metrics? A large drop in indexed pages does not necessarily mean something is wrong (canonical tags, cleaning up duplicate content, reporting errors, noindex tags, etc. can all cause one).
-
RE: Number of indexed pages dropped dramatically
Have you seen any corresponding drops in traffic? Have you made any recent changes? Redirects, canonicals, a site remodel, link restructuring, a hosting change, a CMS update, etc. It's unlikely anyone will come up with the correct reason without more information.
-
RE: Matt Cutts says 404 unavailable products on the 'average' ecommerce site.
Personally, I prefer leaving unavailable products (ones that will never come back) up and accessible for a set amount of time, placing a notice and a link on the page to the most relevant available product or related category page, adding a canonical on the unavailable product page pointing to that related product/category page, and then, after a few months, redirecting the unavailable product to the related page.
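For the final step of that process, here's a hypothetical Apache rule permanently redirecting the retired product to its related category once the grace period is over (the paths are made up; on other servers the equivalent 301 rule would differ):

```apache
# After a few months, 301 the discontinued product page to the related category
Redirect 301 /products/discontinued-widget/ /categories/widgets/
```

Until that point, the canonical tag on the still-live product page carries the consolidation, so nothing is lost when the redirect replaces it.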
-
RE: Www and non www showing in Moz Reports despite being correct in GWT
Have all instances of duplicate URLs, with/without www and with/without the trailing slash, been redirected to their correct versions? If they exist and aren't redirected, they are essentially duplicates of each other. Just because they don't show up in site searches on Google doesn't mean the pages don't exist improperly; it just means Google has chosen not to index those versions. If the pages haven't been redirected to their proper URLs, you may want to look into doing that in order to consolidate any potentially lost link equity.
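If the site runs on Apache, a sketch of that non-www to www redirect in .htaccess might look like the following (example.com is a placeholder; this assumes mod_rewrite is enabled and the www version is the preferred one):

```apache
RewriteEngine On

# 301 any non-www request to the www version, preserving the path
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Trailing-slash duplicates would need a similar rule (or consistent internal linking) so that only one form of each URL ever answers with a 200.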
-
RE: How come www.ifundinternational.com beat us despite that most links seems VERY shady?
If they are, in fact, breaking guidelines that Google should be penalizing them for, then there is always the spam reporting tool: https://www.google.com/webmasters/tools/spamreport (you need to be logged in to Webmaster Tools in order to use it).