Yup, deleted.
Posts made by eatyourveggies
-
RE: Have long tail searches increased or decreased?
I've never come across any stats, but logic (well, my logic at least) and everything I hear suggest that, if anything, it's increasing.
People are getting better and better at searching for specific things online.
But... it's ultimately up to Google. Google may decide that your search query, "where to get a Hawaiian pizza 24 hours a day in New York", is very specific, but still focus on shorter phrases like "24 hour pizza" in the search results.
-
RE: Google webmaster tools says access denied for 77 urls
In Webmaster Tools, you can "Fetch as Googlebot", meaning you can enter one of those 77 URLs and see what Googlebot sees when going to that URL.
You can also use:
http://www.dnsqueries.com/en/googlebot_simulator.php
For the URL: http://www.in2town.co.uk/Entertainment-Magazine
the Google Bot Simulator says:
HTTP CODE = HTTP/1.1 301 Moved Permanently
Location = http://www.in2town.co.uk/Showbiz-Gossip
and for: http://www.in2town.co.uk/Weight-Loss-Hypnotherapy-helped-woman-lose-3-stone
HTTP CODE = HTTP/1.1 301 Moved Permanently
Location = http://www.in2town.co.uk/Weight-Loss-Hypnotherapy
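If you'd rather check these yourself without the simulator, a few lines of Python (standard library only) will show you the same status line and Location header. This is just a sketch - the `check` helper makes a single HEAD request and doesn't follow the redirect:

```python
import http.client
from urllib.parse import urlsplit

def parse_status_line(line):
    """Split a status line like 'HTTP/1.1 301 Moved Permanently' into its parts."""
    version, code, reason = line.split(" ", 2)
    return version, int(code), reason

def check(url):
    """Return (status code, Location header) for one request; redirects not followed."""
    parts = urlsplit(url)
    conn = http.client.HTTPConnection(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location")

# e.g. check("http://www.in2town.co.uk/Entertainment-Magazine") should report
# a 301 and the new Location, matching the simulator output above.
```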
Interestingly, both the NEW URLs work fine, although http://www.in2town.co.uk/Weight-Loss-Hypnotherapy doesn't look too good (at least in my web browser) - but that's another issue.
You have a fairly complex .htaccess file (hint: I looked up your OLD .htaccess file - you should delete old .htaccess backups, or protect them, so people can't access them via a web browser), so I'm guessing the problem is within your .htaccess file.
If possible, put a plain and simple .htaccess file in place, test it with Google Webmaster Tools and see if the error still persists.
Adam
-
Open Site Explorer and link numbers
I know this question has been asked many times in this forum but I still can't work it out.
Why does this link:
(which shows all external links to pages "on this subdomain") show 1,935 external links, but this link:
(which is exactly the same, but this time showing followed + 301 links) say "showing 1 - 50 external links" without showing the total? (I know the mouse-over on the question mark says it won't show the total, but I don't understand why it can't, when it could show the total when I requested "all links" instead of just "followed + 301" links.)
It actually lists 700 links (14 pages, 50 results per page). I know the link list is limited to 25 links per domain, but that means you can NEVER know the total link count unless you download the full report.
This makes using OSE to know numbers of links (internal, external, or otherwise) impossible.
And for anyone who uses the API: why does the API (external + followed) return 1,451 links?
I'm sure it's an ongoing issue with people trying to get their head around all of this and I've never really been able to.
Any insight would be much appreciated!
-
RE: Definition of a backlink
For me, when I hire someone to do "backlinking", it's about links from external websites pointing to the client's site.
I prefer the term "link building" as I think it describes more accurately what I want done.
The link profile on the actual client's site (internal links) is something I usually handle myself. It's more "on site" / "on page" optimisation (SEO) as far as I'm concerned.
So I think "backlinking" is external link building and then there's internal link building or on-page SEO.
Not sure why you're asking, but if it's because of some confusion between you and a contractor or client: there's no dictionary definition, so clarification is always required when using these terms.
-
RE: Social Signals and SEO
I'm actually not entirely sure how much access Google has to this kind of information. Over the last few years there's been ongoing debate about exactly what Google can see in terms of Facebook data, but it does seem clear that Facebook (social) signals correlate with rankings.
Just to clarify, people can't "like" a website (well, they can like the home page, but that's just the home PAGE, not the web SITE), but you can have a "like" button on every page of your website.
You can also encourage people to like your facebook page.
And as an aside, "like" and "recommend" are the same thing. "Share" takes a little more effort from the user, and I don't think many people use "share" any more - not too sure.
But why not do both? - a like button on all key pages of your website, and also encourage people to like your Facebook page.
Someone liking your web page means it'll appear in their news feed, their friends will see it, and then that's it. Liking a facebook page means you have that person seeing all activity on your facebook page.
I know you're asking about SEO / Google / Social Signals but I thought it was worth clarifying the above as I'm sure while you want better rankings, you also want more traffic to both your site and your facebook page.
You can associate your website with your Facebook page and, using Open Graph, post updates to that page; they'll appear in the feeds of anyone who has liked your Facebook page.
As for which is better? I'd like to see if someone actually has an answer to that. I haven't been able to find specific information on exactly what Google sees, what it indexes, and what social signals it takes into account.
Adam
-
RE: Has anyone seen this before? One domain dominates the entire first page!
If it ever happens again, rush to your client and demand a bonus... before Google fixes it!

-
RE: Competitors traffic
I love SEMRush for this. Give it a keyword and it'll show you the top organic and paid listings. Give it a URL and it'll give you organic and paid keywords for that URL.
SEMRush (and all such tools) can only give an estimate of the amount of traffic. These tools base their estimates on the position of the competitor in the SERP, the monthly search volume of that keyword, and the likelihood of someone clicking on that result.
If you know that x% of people click on the top organic result and you know how many people are searching per month for that keyword, then you can get a pretty good idea of what the traffic is going to be like for that URL.
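As a rough sketch of that calculation (the click-through rates below are made-up placeholders, not real study numbers - published CTR studies vary a lot):

```python
# Hypothetical click-through rates by organic position - placeholders only
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def estimated_monthly_traffic(monthly_searches, position):
    """Estimate visits from one keyword: search volume x CTR for that position."""
    ctr = CTR_BY_POSITION.get(position, 0.02)  # assume roughly 2% beyond position 5
    return round(monthly_searches * ctr)
```

So a keyword searched 10,000 times a month with the competitor sitting in position 1 would be estimated at around 3,000 visits; tools like SEMRush effectively sum this across every keyword the site ranks for.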
As for which keywords are converting... you'll need Google AdWords / Analytics for that. You could ask your competitors for access, which would be funny but obviously refused!
But keep in mind, if you're trying to find the "top" keywords, tools like SEMRush only report the keywords that are in use / working for a given competitor. It doesn't mean there aren't other keywords you should look into - keywords that your competitors aren't ranking for at all. In fact, that's the goal... find the keywords your competitors don't know about (and that tools like SEMRush therefore don't know about).
Adam
-
RE: Does Open Site Explorer show juice passing links
Hi Mark,
So you're saying the API's "juice passing links" are equivalent to what OSE calls "followed + 301" links?
-
Does Open Site Explorer show juice passing links
I'm a little confused between all the link types.
Internal / External - easy
Linking Root Domain - easy
Followed / Nofollowed - easy
But then there's talk about "juice passing links" and I can't quite get how this is defined, and why it's something you can get from the API, but not from Open Site Explorer... or can you get that info from OSE?
-
RE: Domain / Page Authority - logarithmic
I received a response in the developers' forum which basically said the input data is scaled logarithmically, rather than the output being linear and then scaled logarithmically.
This means it's impossible to answer the question.
However, I have asked (and repeated the question here for the sake of anyone who's interested)...
What is the distribution of DA values across all sites? It would be nice to know that "the median DA across all sites in our database is x." That would at least put the numbers in some perspective - and perspective is what I'm trying to get.
Can you also confirm if the "keyword difficulty" is also calculated with logarithmic inputs? And what's the median keyword difficulty?
-
RE: Domain / Page Authority - logarithmic
No, I'm not seeking something deeper at all.
I don't care how they work out DA / PA.
The result is between 0 and 100, but they say it's logarithmic.
So is it log(x), or log(x+3), or log2(x)... ?
-
RE: API and bitflag question
Yes I think my email was playing up as I missed a few emails in recent days.
Why is this question marked as answered?
How did that happen?
-
RE: Domain / Page Authority - logarithmic
I absolutely agree that reinventing the wheel is inefficient when people like SEOmoz have put thousands of man-hours into developing great metrics... but I use the SEOmoz metrics alongside other metrics for a few reasons:
- To confirm the validity of metrics between providers and my own research
- To customise the kinds of reports I give clients. For example, sometimes a link profile report is more relevant for a customer than a domain authority report. Sometimes both are relevant, etc.
It just sucks when you have to put caveats on data, such as saying that the SEOmoz authority metric is logarithmic but on an unknown logarithmic curve.
I think SEOmoz should publish the logarithmic calculation. I'm not asking for their intellectual property on how they calculate authority or keyword difficulty, etc... I just want to know the logarithmic calculation. Otherwise I'm left asking, "what does 30 actually mean?" In addition, is the keyword difficulty logarithmic? SEOmoz doesn't say.
Adam
-
RE: API and bitflag question
Hi Ryan,
Thanks, but I'm actually posting here because it's been several days with no response to my ticket.
So not sure what to do

-
RE: Is there a tool for measuring content freshness?
Many sites send a Last-Modified HTTP header (and honour If-Modified-Since requests). If so, when looking at a page you want to check, type this into your address bar:
javascript:alert(document.lastModified)
But that won't work on all sites, as some may disallow javascript execution in that manner. And if it's a dynamic site and the owner hasn't configured the Last-Modified header properly, you'll get incorrect dates anyway.
You can check whether a website sends the Last-Modified / If-Modified-Since headers here:
http://www.hscripts.com/tools/if-modified-since/index.php
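You can also do that check with a few lines of Python (standard library only). A sketch - the `last_modified` helper makes one HEAD request; the date parsing is a separate, pure function:

```python
import http.client
from email.utils import parsedate_to_datetime
from urllib.parse import urlsplit

def last_modified(url):
    """Return the Last-Modified header for a URL, or None if the site doesn't send one."""
    parts = urlsplit(url)
    conn = http.client.HTTPConnection(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    return conn.getresponse().getheader("Last-Modified")

def parse_http_date(value):
    """Turn an HTTP date like 'Wed, 21 Oct 2015 07:28:00 GMT' into a datetime."""
    return parsedate_to_datetime(value)
```

If `last_modified` returns None (or always returns "now"), the site isn't giving you a usable freshness signal, which is exactly the caveat above.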
You can use the Internet Archive to look at previous versions of a website (unless they've blocked the archive's crawler, e.g. via robots.txt):
A cool tool I use to alert me to when a competitor (or any website) content is updated is:
http://www.changedetection.com
Enjoy
-
RE: Cross links between sites
What do you mean by "shut down" ?
If you can navigate the site, then so can Google. Changing your home page to "we're no longer here" won't help either, as Google can still access the pages "behind" your home page - it already knows they exist, so it'll go to them directly when crawling.
You either need to delete the pages entirely, delete the entire website, or edit your DNS so your domain name no longer resolves... something that permanently "shuts down" and removes the site.
-
Domain / Page Authority - logarithmic
SEOmoz says their Domain / Page Authority is logarithmic, meaning lower scores are easier to gain and higher scores harder.
Makes sense.
But does anyone know what logarithmic equation they use? I'm using the domain and page authority as one metric in amongst other metrics in my keyword analysis. I can't have some metrics linear, others exponential and the SEOmoz one logarithmic.
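To combine a logarithmic metric with linear ones you'd normally exponentiate it back onto a linear scale - which is exactly why the base matters. A sketch, where the scaling constant `K` (and the base 10) are pure assumptions, since SEOmoz doesn't publish them:

```python
import math

# ASSUMED scaling: DA = K * log10(raw_strength). K is a placeholder;
# the real constant (and base) is what this question is asking for.
K = 10.0

def linearize(da, k=K):
    """Map an assumed-logarithmic 0-100 score back onto a linear scale."""
    return 10 ** (da / k)
```

Under that assumption a DA 40 site would be 100x "stronger" than a DA 20 site, not 2x - which is why you can't just average DA with linear metrics without knowing the curve.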
-
API and bitflag question
I've googled and read and I'm still confused on this.
Regarding the link metric bit flags:
http://apiwiki.seomoz.org/link-metrics
I understand everything about how the API works and have it all working, except for this bit flag thing... and there's no example on that page of how to use the bit flags in an API request, so it's hard to work out.
My current API call is:
"http://lsapi.seomoz.com/linkscape/" & $db & "/" & $objectURL & "?
SourceCols=" & $SourceCols &
"&TargetCols=" & $TargetCols &
"&Scope=" & $Scope &
"&Sort=page_authority" &
"&Filter=" & $Filter &
"&AccessID=" & $accessID &
"&Expires=" & $expires &
"&Signature=" & $urlSafeSignature
There are 4 Link Metric Bit Flags: 2, 4, 8, and 16. How do I use those in the above URL?
Do I use those in SourceCols? And there are about 15 Link Flag Definitions (ranging from 1 to 65536) - do I use those in TargetCols?
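For what it's worth, the mechanics of bit flags in general (whichever parameter they go in) are just bitwise OR: each flag is a distinct power of two, so you combine the ones you want and pass the single total. A generic sketch - the flag names and values here are illustrative, not the real API values (those are on the wiki page):

```python
# Illustrative flags only - each one a distinct power of two
FLAG_A = 2
FLAG_B = 4
FLAG_C = 8

# Combine with bitwise OR (equivalent to summing distinct powers of two)
cols = FLAG_A | FLAG_B | FLAG_C  # 14 - pass this one integer in the query string

def has_flag(value, flag):
    """Check whether a combined value includes a given flag."""
    return value & flag == flag
```

So, under this scheme, something like SourceCols=14 would request all three flags at once, and the same `has_flag` decoding works on combined values the API returns.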
I've tested different variations and I get different responses but the responses aren't documented either.
For example, if I get "luutrp":5.432777943351583,"luutrr":1.415861867916131e-14 in a response... what ARE luutrp and luutrr?
These acronyms aren't outlined anywhere that I can find.
This link:
http://www.seomoz.org/ugc/the-busy-developers-guide-to-seomoz-bit-flags
outlines an example, but it uses Cols (not SourceCols or TargetCols), so either that page is outdated or I'm missing something. Is there also a "Cols" parameter?
Basically, I just want this: given a single URL, show me all links (page to page) pointing to that URL, and give me the same information that's displayed in Open Site Explorer (page authority, anchor text, title, etc.) - but also some additional information such as "is it on the same C block?"
What's an example URL for the above request?
Really hoping someone can shed some light on this under-documented API.
-
RE: How to un-answer a question?
Ah so people get points?! I'll have to research what they do but don't worry Dana, the points you shall keep!
But yes, a good feature would be to be able to set a question to unanswered.