Posts made by James77
-
RE: Suggestion - Should OSE include "citation links" within its index?
Thanks Miranda, that would be great.
I really hope it's added - this could be a very positive differentiator between Moz and other link research tools out there. Combining this citation link finder with the 'Just Discovered' links would, in my view, make a killer feature well above the competition.
James
-
RE: Moz's official stance on Subdomain vs Subfolder - does it need updating?
Thanks Rand - that's great information.
When you talk about rankings rising, did you see them rise for the KWs associated with http://moz.com/beginners-guide-to-seo, or are we talking about rankings for other Moz pages? I.e. did adding http://moz.com/beginners-guide-to-seo contribute to a rise in rankings across the whole domain, or just that subfolder?
I hope you consider making this into one of your WBFs / posts, as I think it would be fascinating to see the "What you did", "What were the results" etc., and also to get feedback on what others have experienced.
Many thanks
-
Moz's official stance on Subdomain vs Subfolder - does it need updating?
Hi,
I am drawing your attention to Moz's Domain basics here: http://moz.com/learn/seo/domain
It reads:
"Since search engines keep different metrics for domains than they do subdomains, it is recommended that webmasters place link-worthy content like blogs in subfolders rather than subdomains. (i.e. www.example.com/blog/ rather than blog.example.com) The notable exceptions to this are language-specific websites. (i.e., en.example.com for the English version of the website)."
I am wondering if this is still Moz's current recommendation in the subfolders vs subdomains debate. The above (sort of) implies that SEs may not combine ranking factors for the domain as a whole if subdomains are used, which (sort of) contradicts Matt Cutts' last video on the matter ( http://www.youtube.com/watch?v=_MswMYk05tk ), which implies that this is not the case, and that there is so little difference that their recommendation is to use whatever is easiest. It would also seem to me that, looking through the eyes of Google, it would be silly to treat them differently if there were no difference at all other than subdomain vs subfolder, as one of the main reasons a user would use a subdomain is a technical one, which it would not make sense for Google to treat differently in terms of its algorithm.
I notice that while most of the Moz site uses subfolders, you do have http://devblog.moz.com/ - and I was wondering if this is due to a technical reason or a conscious decision. It would seem to me that the content within this section is indeed link-worthy (it has links pointing to it from external sources), so it does not appear to follow the advice posted in Moz's basics on domains. I am therefore assuming it is down to a technical reason - or that Moz's advice is out of date with current Moz thinking, and is in fact in line with Matt Cutts in that it doesn't matter.
Cheers
-
RE: Suggestion - How to improve OSE metrics for DA & PA
Cheers Pete.
I totally understand the data dependency. One thing you could do which would not require a (long-term) data dependency, and would also help with the spam detection you're building, is to take a single snapshot of "ranking" and use it as a data set to pattern-match spam sites. E.g. if you managed to pull hundreds of thousands of ranking scores (say, traffic scores from SEMrush), you could match those against Moz's current scoring for each domain, bucket the sites into groups with higher or lower ranking scores than their DA would predict, and then try to reverse engineer the link or other patterns common to those buckets.
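To make the bucketing idea concrete, here is a minimal Python sketch. The DA-to-expected-traffic model, field names and tolerance are illustrative assumptions for this example, not Moz's actual internals:

```python
# Bucket domains by whether their observed search traffic over- or under-shoots
# what their DA would predict. The log-linear model below is a stand-in;
# any fitted DA -> expected-traffic curve would do.

def expected_traffic(da):
    """Assumed toy model: traffic grows roughly exponentially with DA."""
    return 10 ** (da / 20)

def bucket_domains(domains, tolerance=0.5):
    """domains: list of dicts with 'name', 'da', 'traffic' (e.g. from SEMrush).
    Returns the domains split into under-, in-line- and over-performers."""
    buckets = {"under": [], "inline": [], "over": []}
    for d in domains:
        ratio = d["traffic"] / expected_traffic(d["da"])
        if ratio < 1 - tolerance:
            buckets["under"].append(d["name"])  # suspiciously low traffic for its DA
        elif ratio > 1 + tolerance:
            buckets["over"].append(d["name"])
        else:
            buckets["inline"].append(d["name"])
    return buckets

sample = [
    {"name": "somuch.com", "da": 72, "traffic": 400},   # far below prediction
    {"name": "example.com", "da": 40, "traffic": 120},  # roughly in line
]
print(bucket_domains(sample))
```

The "under" bucket is the interesting one for spam detection: those are the domains whose DA is out of step with what Google appears to think of them.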
-
Suggestion - How to improve OSE metrics for DA & PA
I am sure everyone at Moz is aware that although the Moz link metrics (primarily I am talking about DA and PA) are good, there is a lot of room for improvement, and there are a lot of areas where the metric values given to some types of site are well out of whack with what their "real" values should be.
Some examples:
www.somuch.com (Link Directory) - DA 72
www.articlesbase.com (Article Directory) - DA 89
www.ezinearticles.com (Article Directory) - DA 91
I'm sure everyone would agree that links from these domains are not as powerful (if of any value at all) as their DA would suggest, and therefore, by definition of how Moz metrics work, the scores of sites that have links from these domains are also inflated - thus they throw the whole link graph out of whack.
I have 2 suggestions which could be used singly or in conjunction (and obviously alongside the other factors Moz uses to calculate DA and PA) to help move these values closer to what they should more realistically be.
1/. Incorporate rank values.
This effectively uses rank values to reverse engineer the "value" Google (or other engines) places on a website. It could be achieved (if Moz were not to build the data-gathering system itself) by integrating with a company that already provides this data - e.g. Searchmetrics, SEMrush etc. As an example, you would take a domain and pull in some rank values, e.g. http://www.semrush.com/info/somuch.com?db=us - where you could use traffic, traffic price and traffic history as metrics within the overall Moz scoring algorithm. As you can see from my example, according to SEMrush the amount of traffic and the traffic price are extremely low for a website with a DA of 72. You will find the same for the other two sites, and similarly for pretty much any other site you test. This is essentially because you're tapping into Google's own ranking factors, and thereby getting more in line with the real values (according to Google) of the quality of a website. Therefore, if you were to incorporate these values, I believe you could improve the Moz metrics.
2/. Social Sharing Value
Another strong indicator of quality is the amount of social sharing of a document, or of a website as a whole, and again you will find, as with my examples, that pages on these sites have low social metrics compared with what you would normally associate with sites of these DA values. Obviously, to do this you would need to pull social metrics for all the pages in your link DB. Or, if this were too tech-intensive to achieve, again work with a partner such as Searchmetrics, which provides "Total Social Interactions" on a domain-level basis. Divide this value by the number of Moz-crawled pages and you would have a crude value for the average social score of a page on a given site.
Obviously both of the above have their flaws if looked at in complete isolation, but in combination they could provide a robust metric to use in any algorithm, and combined with the values Moz currently uses, I believe you could make big strides in improving the overall Moz metrics.
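As a rough illustration of how the two signals might be blended, here is a Python sketch. The weights, the "healthy" per-page social baseline and the function names are all made-up assumptions, not Moz's formula:

```python
# Toy blend of the two suggested signals into a single DA adjustment.
# Weights and thresholds are illustrative guesses only.

def social_average(total_social_interactions, crawled_pages):
    """Crude per-page social score: domain-level interactions / crawled pages."""
    return total_social_interactions / max(crawled_pages, 1)

def adjusted_da(da, traffic_ratio, social_avg, w_traffic=0.3, w_social=0.2):
    """da: current DA (0-100).
    traffic_ratio: observed traffic / traffic a site of this DA would predict.
    social_avg: average social interactions per crawled page.
    Both signals pull DA down when they fall far below expectation."""
    # Map each signal into [0, 1]; 1.0 means "as good as the DA implies".
    traffic_score = min(traffic_ratio, 1.0)
    social_score = min(social_avg / 10.0, 1.0)  # assume 10 interactions/page is "healthy"
    penalty = w_traffic * (1 - traffic_score) + w_social * (1 - social_score)
    return round(da * (1 - penalty), 1)

# A DA-72 directory with almost no traffic or shares gets pulled down sharply:
print(adjusted_da(72, traffic_ratio=0.05, social_avg=0.2))
```

A site whose traffic and social sharing match what its DA predicts is left untouched, which is the point: the adjustment only bites on the out-of-whack cases.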
-
Suggestion - Should OSE include "citation links" within its index?
This is really a suggestion (and a debate, to see if people agree with me) with regard to including "citation links" within Moz tools, by default, as just another type of link.
NOTE: when I talk about "citation links" I mean a link that is not wrapped in a link tag and is therefore non-clickable, e.g. moz.com
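For clarity, detecting such links is mechanically straightforward. Here is a simplified Python sketch of finding URL-like mentions that sit outside any anchor tag; the regexes are deliberately naive assumptions, and a real crawler would use a proper HTML parser and a much more robust URL pattern:

```python
# Find "citation links": URL-like strings in page text that are NOT
# inside an <a> tag.
import re

ANCHOR = re.compile(r"<a\b[^>]*>.*?</a>", re.IGNORECASE | re.DOTALL)
URL = re.compile(
    r"\b(?:https?://)?(?:www\.)?[a-z0-9-]+(?:\.[a-z]{2,})+(?:/\S*)?",
    re.IGNORECASE,
)

def citation_links(html):
    """Return URL-like mentions that appear outside any <a>...</a> element."""
    stripped = ANCHOR.sub(" ", html)              # drop real (clickable) links first
    stripped = re.sub(r"<[^>]+>", " ", stripped)  # drop remaining tags
    return URL.findall(stripped)

html = '<p>See <a href="https://moz.com">Moz</a> and also moz.com/blog for more.</p>'
print(citation_links(html))
```

Only the unlinked mention survives; the clickable link to moz.com is excluded, which is exactly the distinction a "citation link" index would need to make.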
Obviously Moz has released the mentions tool, which is great, and also FWE, which is also great. However, it would seem to me that they are missing a trick, in that "citation links" don't feature in the main link index at all. We know that Google at a minimum uses them as an indicator to crawl a page ( http://ignitevisibility.com/google-confirms-url-citations-can-help-pages-get-indexed/ ), and also that they don't pass PageRank - HOWEVER, you would assume that Google does use them in its algorithm in some manner, as it does nofollow links.
It would seem to me that a "citation link" could (possibly) be deemed more important than a nofollow link in Google's algorithm, as a "nofollow" link is a clear indication by the site owner that they don't fully trust the link, whereas a citation link indicates neither trust nor distrust.
So - my request is to get "citation links" into the main link index (and the Just Discovered index for that matter).
Would others agree??
-
RE: Best SEO practice - Umbrella brand with several domains
I would, if possible, try to concentrate on a single brand/domain as opposed to multiple domains - on the assumption that an identical piece of content will rank higher on a single strong domain than it would on any of your less strong segmented domains.
With a single domain you also don't have the issue of worrying about how to cross-market across domains (although, as you point out, many sites do so - Moz / OSE being one example). It also removes the worry about whether Google may see cross-linking between multiple domains as a 'link network' - unlikely if it is just a few domains, but if you are talking about tens of different domains then this could well raise a flag in Google's algorithm.
It is still unclear how Google treats multiple domains owned by the same company and on the same theme. Many websites have subdomains for their blogs and other sub-sections, and the standard thinking a few years ago was that you were better off moving those into folders. However, I think things have now changed, such that if there are enough connecting signals it doesn't matter whether it's a subdomain or a subfolder. Since a subdomain is technically a separate site, you would think that if similar connecting signals were to appear between two separate domains, Google could (and probably should) treat them as it would subdomains. Whether there is some, all or no truth in this is likely to remain speculation until Google confirms something along these lines.
-
RE: How long before your rankings improved after Penguin?
I think it's very unlikely it takes 6 months for a disavow file to be processed - I have heard ranges from a couple of weeks to a couple of months, but no more.
If you're talking about an algorithmic penalty (like Penguin) then you should most likely see more gradual changes than with a manual penalty (which you would need to get removed via a reconsideration request to see improvement).
When you mention 'go overboard', it is for you to decide what is a good or bad link, but in my opinion, if you are under any penalty you need to be pretty thorough in removing all the links you think could be causing you issues.
-
RE: Good social media project manager tool
There are so many tools out there that it is quite mind-boggling; however, I would also recommend HootSuite. Most tools have free trials, so it's probably worth picking a few you think may fit and testing them out.
Have a look at the posts linked below, which cover some of the best-known tools with some objective summaries:
http://www.socialmediaexaminer.com/social-media-tools-used-by-pros-today/
http://dashburst.com/best-social-media-management-tools/
-
RE: How to NOT appear in Google results in other countries?
Do they cause you any issue other than affecting your bounce rate?
If not, then I would not try to stop them visiting your website, but just segment the bounce rates as you have done above.
The reason I say this is three fold:
1/. Trying to prevent SEs from displaying you to certain countries could be fraught with problems, and you could well end up damaging the SERPs in your key countries.
2/. 'Country by IP' is not an exact science - and you may exclude some people who are located in your target countries.
3/. If you ever expand into the countries you want to currently block, would it not be nice to already have traffic from that market? What about customers who are from your target countries but searching while they are on holiday abroad?
Hope that helps
-
RE: Robots.txt and Magento
I assume this is a robots.txt that has been automatically created by Magento? - or has it been created by a developer?
I ran it through a tool and it showed 1 error and 10 warnings - so I would say you definitely need to do something about it.
The reason for all those disallows is to try to stop search engines from indexing those paths (whether they would even find them to index if the disallows were not there is debatable).
What you could do is set up robots.txt as you have suggested, and then stop the SEs indexing the directories or pages you don't want via the appropriate webmaster tools.
I don't like listing a lot of 'don't index' paths in robots.txt, as it pretty much tells any hacker or nasty spider where your weak points may be.
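The point is easy to demonstrate: robots.txt is public, so anyone can harvest the paths you list. A small Python sketch (the sample file content is invented for illustration):

```python
# Demonstrates why long Disallow lists leak information: anyone can read them.
# This simply extracts every disallowed path from a robots.txt body.

def disallowed_paths(robots_txt):
    """Return the paths listed after 'Disallow:' directives."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # strip comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # a bare 'Disallow:' means "no restriction"
                paths.append(path)
    return paths

sample = """User-agent: *
Disallow: /admin/
Disallow: /checkout/
Disallow:            # empty = no restriction
"""
print(disallowed_paths(sample))
```

Two lines of code and the "hidden" admin and checkout areas are enumerated, which is exactly why keeping sensitive paths out of robots.txt (and controlling indexing another way) is preferable.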
-
RE: Dramatic decline in rankings
Firstly - check in Google Webmaster Tools to see if you have got a manual penalty.
Secondly make sure you haven't done anything silly like blocking search engines on robots.txt or response headers.
Thirdly I would then do a full link audit of your site - use OSE, Ahrefs, Majestic and WMT. Go through the links and ask yourself if they were honestly and naturally placed, and if you think Google could have an issue with them - and if so try and get them removed.
-
RE: To all the PPC expert :
With regard to the above, I think the tool you are looking for is http://www.google.co.uk/intl/en/adwordseditor/ - this is basically a desktop app that allows you to manage all your campaigns. For some reason Google doesn't promote it very much, so a lot of people don't know it exists. It will save you a lot of time when creating and managing large campaigns.
With regard to managing your budgets, it really depends on what you're trying to do, and AdWords is so flexible (and pretty confusing unless you're an expert!) that it is hard to give a single answer. You can manage budgets on many different levels, from overall campaign down to individual keywords.
The thing with AdWords is that it's an organic learning process and needs constant monitoring and adjustment. My advice would be to make sure you have daily budget caps in place, and then try putting max bids on your ad groups. Then, as time goes by, monitor your ROI on these campaigns and adjust accordingly.
-
RE: Do 'Just Discovered' Links get added to the main link index?
Hi Sam,
That's great to know. IMO the ability to get these 'just discovered' links into the main index is a significant advantage for Moz over other tools. Given that a just-discovered link must first be tweeted, these links are (generally) more important than non-tweeted links, and therefore of more importance to the overall link profile.
Cheers
-
RE: Removing the clutter of site-wide links
ahrefs.com provides the separation of sitewide and non-sitewide links as a standard headline metric, and it is a useful one.
My question wasn't about the effect of sitewide links (we all know that manipulated sitewides are a red rag to a bull - but they can equally be totally natural and clean), but the fact that they create an awful lot of noise in some of the reports, and it would be nice to be able to filter that noise out.
-
Removing the clutter of site-wide links
I have a multi-part question with regard to the moz link index and some presentation suggestions.
Firstly, I would be interested to know how the link index treats site-wide links with regard to metrics such as DA and PA. It is highly likely that SEs do not pass full link value across sitewide links, and it would therefore make sense for Moz's values to account for this as well - if they do not already.
One annoying thing about sitewides is that they tend to clutter much of the information presented in a few of the tools (you can't see the wood for the trees, as it were). This is most prominent on the "Just Discovered" page - if you have a sitewide link on a large site, you can often find that this screen is totally filled with those links as they are found. It would be very useful to be able to filter them out, as they are of little interest - currently I can't see a way of doing so.
A further place where they create too much noise is the 'Total Links' value. Where sitewides are included, the value becomes pretty meaningless, as you can find that the majority of it is sitewides. It would therefore be useful to have another value, 'Total Links - Excluding Sitewides', where a sitewide link perhaps just adds 1 to the count.
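The suggested count is simple to sketch in Python. The threshold for calling a linking domain "sitewide" (50 linking pages here) is an arbitrary assumption for illustration:

```python
# Sketch of a 'Total Links - Excluding Sitewides' count: collapse each
# sitewide linking domain to a single unit instead of counting every page.
from collections import Counter

def total_links_excluding_sitewides(links, sitewide_threshold=50):
    """links: one entry per link, each the linking page's root domain."""
    per_domain = Counter(links)
    total = 0
    for domain, count in per_domain.items():
        # A domain over the threshold is treated as sitewide and counts as 1.
        total += 1 if count >= sitewide_threshold else count
    return total

# 5000 footer links from one blog collapse to 1; the handful of genuinely
# distinct links still count individually.
links = ["bigblog.com"] * 5000 + ["news.com"] * 3 + ["forum.com"]
print(total_links_excluding_sitewides(links))
```

The raw 'Total Links' here would be 5004, dominated by one footer link; the sitewide-collapsed count of 5 is much closer to how many distinct linking decisions were actually made.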
-
RE: Is it OK to Delete a Page and Move Content to a Another Page without 301 re-direct
Yes, I would certainly recommend 301'ing the link - or, better, could you simply overwrite the old page (A) with the new page you are intending to create (B) - unless, of course, B already exists?
If you're struggling with managing the 301, I would first check whether there are any external backlinks pointing to page A. If there are, then I would certainly try to find a way to 301. If not, then it will not affect things too much, as there is no external link equity going to the page to lose.
A further option, if you are unable to 301 and are copying the content from page A to page B, is to rel=canonical A to B, which should pass the link equity across in a similar way to a 301.
-
Do 'Just Discovered' Links get added to the main link index?
Hi,
I was wondering if the 'Just Discovered' links get added to the main link crawl index?
It would seem to make sense for them to do so, as this would enable the link index to be more up to date than it would otherwise be. Observing the link index, it would seem that at the moment this does not happen and they are totally separate indexes (based on personal observation).
Thanks
-
RE: Doubt with no follow links: disavow or no action?
It really depends on what those 'nofollow' links are. Generally they cannot hurt you, but if they are placed in a manipulative way, e.g. mass blog comments, and furthermore if they are on bad sites, then they can hurt you - see Matt Cutts' video for more info on this: http://www.youtube.com/watch?v=QSEqypgIJME