Best posts made by Cyrus-Shepard
-
RE: Question about optimising an inner page as opposed to the homepage
It's very common for an internal page to be more popular than the homepage, and there is no known penalty or discrimination in such a case.
-
RE: Open Site Explorer CSV report pending for 2 days!
Hi John,
Most reports are complete within a few minutes, sometimes a few hours. Occasionally a report gets hung up and it can take days. The engineering team is constantly working to improve this process, but I understand these hangups can be extremely frustrating when they happen.
As Harald suggested, feel free to contact the SEOmoz Help Team (help@seomoz.org) if you find yourself waiting an extra long time for a report. Often they can have an engineer look into the system and solve the problem.
-
RE: DAs dropped for many pages?
These other folks left some fantastic responses, so I'll only add that DA is a relative number generated with each index update. Its best use case is comparing yourself to other sites, so if both you and your competitor dropped a similar amount, that's actually not a bad thing.
-
RE: Competitive analysis of inbound links
Hi Dan,
The only difficulty is that OSE limits the number of links shown from each domain to 25 per report. This limit is meant to surface a greater variety of linking domains; without it, a report could be filled with 10,000 low-quality, repeating sidebar links all from one auto-generated blog.
One option would be to use the SEOmoz API to pull the data you're looking for, though building a tool just for this might be overkill.
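If you do go the API route, here's a rough Python sketch of what paging through the links endpoint might look like. It assumes the legacy Linkscape-style endpoint and signed authentication, with hypothetical credentials; check the current API docs for exact parameter names and limits before relying on any of it.

```python
# Rough sketch: paging through the SEOmoz (Linkscape) links API.
# Endpoint, parameter names, and auth scheme are assumptions based on the
# legacy API of the era -- verify against the current API documentation.
import base64
import hashlib
import hmac
import json
import time
import urllib.parse
import urllib.request

ACCESS_ID = "member-xxxxxxxx"   # hypothetical credentials
SECRET_KEY = "your-secret-key"

def signed_params():
    # Signature = base64(HMAC-SHA1("AccessID\nExpires", secret))
    expires = int(time.time()) + 300
    to_sign = f"{ACCESS_ID}\n{expires}".encode()
    sig = base64.b64encode(hmac.new(SECRET_KEY.encode(), to_sign, hashlib.sha1).digest())
    return {"AccessID": ACCESS_ID, "Expires": expires, "Signature": sig.decode()}

def get_links(target, offset=0, limit=50):
    params = signed_params()
    params.update({"Scope": "page_to_page", "Sort": "page_authority",
                   "Offset": offset, "Limit": limit})
    url = ("http://lsapi.seomoz.com/linkscape/links/"
           + urllib.parse.quote(target, safe="") + "?" + urllib.parse.urlencode(params))
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Page through results 50 at a time to get past the per-report cap.
for offset in range(0, 500, 50):
    batch = get_links("www.example.com/", offset=offset)
    if not batch:
        break
```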
Here's a simpler solution that should be easy. Check out Dr. Pete's post on Link Profiling with Open Site Explorer. He provides some tools for a deeper analysis of link distribution that you should find insightful.
When you're done with that, try out these Excel spreadsheets from John Doherty. They look at both the PA and DA of your backlinks, with anchor text distribution as an added bonus.
Hope this helps! Best of luck with your SEO.
-
RE: Google crawling different content--ever ok?
This is not the definition of cloaking and I wouldn't worry too much about any penalty.
That said, any time you serve Googlebot a different experience than users, it's a situation you want to be very careful with, and in most situations avoid. Often this is handled by serving the different experiences via JavaScript. Even though Google is pretty darn good at parsing JavaScript, it will often interpret the default version of a page as if JavaScript were turned off.
Regardless, I'd keep an eye on search results, Google Webmaster Tools, and cached versions of your site, and make ample use of "Fetch and Render" in GWT to ensure Google interprets your site the way you think it should.
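As a supplement to Fetch and Render, here's a minimal Python sketch for spot-checking whether your server returns different HTML to Googlebot's user-agent than to a regular browser. The URL is a placeholder, and since some setups key off IP rather than user-agent, treat a match here as suggestive rather than conclusive.

```python
# Minimal sketch: compare the response served to Googlebot's user-agent
# against the response served to a regular browser user-agent.
import urllib.request

URL = "http://www.example.com/"  # hypothetical page to test
GOOGLEBOT = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36"

def fetch(user_agent):
    req = urllib.request.Request(URL, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

bot_html, user_html = fetch(GOOGLEBOT), fetch(BROWSER)
if bot_html != user_html:
    print("Googlebot and browser versions differ -- review the differences")
else:
    print("Identical responses for both user-agents")
```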
-
RE: Duplicate page content showing up with proper use of canonical tag
Hi Leigh,
The SEOmoz PRO platform is designed to detect canonicals and disregard these kinds of errors when proper canonicals are in place.
That said, there have been bugs in the past that prevented this from working correctly, though most of those have been fixed. If you are still seeing problems, I encourage you to contact the help team (help@seomoz.org) to make sure everything is working okay in your campaign, and to verify this is actually a bug and not something wrong with your canonical tags.
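To double-check the tag side yourself first, here's a quick (and deliberately naive) Python sketch that pulls the canonical URL a page actually declares. The URL is a placeholder, and a real audit should use a proper HTML parser rather than a regex.

```python
# Quick sketch: extract the declared rel="canonical" URL from a page,
# to rule out a tag problem before reporting a bug.
import re
import urllib.request

def get_canonical(url):
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    # Naive regex (assumes rel comes before href); use an HTML parser for real work.
    match = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', html, re.I)
    return match.group(1) if match else None

print(get_canonical("http://www.example.com/some-page/"))  # hypothetical URL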
Best of luck!
-
RE: I would like to get rid of 300,000+ links, please
Hi Linda,
In the absence of any penalty or rankings dip, there's likely nothing to worry about, as this is pretty common. If nothing else, even though Google reports those links in GWT, it's very possible it discounts them in every other way that counts.
On the other hand, if you are worried about it, there should be no harm in proactively disavowing the domain. Lots of agency folks I know make it a regular practice to disavow any suspect links, and 99.9% of the time, if done correctly, no harm will come to your site simply from filing a disavow file (unless you disavow links that are actually good, but that's another story).
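For reference, the disavow file format itself is simple: one URL or domain: line per entry, with # lines as comments. A minimal Python sketch (the domains are hypothetical):

```python
# Minimal sketch: write a disavow file in the format Google expects --
# one "domain:" line per suspect domain, "#" lines as comments.
suspect_domains = ["spammy-links.example", "low-quality-directory.example"]  # hypothetical

with open("disavow.txt", "w") as f:
    f.write("# Domains disavowed after manual link review\n")
    for domain in suspect_domains:
        f.write(f"domain:{domain}\n")
```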
-
RE: Question regarding the SEOmoz Report Card
First of all, you can run multiple reports for the same URL using different keywords. You can run as many keyword/URL reports as you wish, but each will return a separate grade. More information on running custom reports:
http://www.seomoz.org/help/on-page-reports
Obviously, there's a limit to how many keywords you can score an "A" with on the same page. Otherwise, Wikipedia could just build a single page that ranked for every keyword in the world (not really, I'm exaggerating).

Conceptually, Google treats "Miami SEO" and "SEO Miami" as very close, but there are multiple ways to optimize the same page for both keywords:
- Vary the keyword in your URL
- Variations of the keyword in your h1, h2, h3 headers
- Variations of the keyword in your body text
- Variations of partial match anchor text in links pointing to the page
- Links from external websites that themselves are about "miami seo" etc.
Instead of creating a page for every possible variation (which the Panda algorithm was designed to target and punish), try creating in-depth content that explores "Miami SEO" completely. You'll rank better for long-tail terms and improve your relevance for the short-tail terms as well.
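If you want a quick way to see where a keyword and its variants currently appear on a page, here's a rough standard-library Python sketch. The URL and variants are placeholders, and the regexes are intentionally simple; a real audit would use an HTML parser.

```python
# Rough sketch: report where a keyword (or close variant) appears on a
# page -- title, h1-h3 headers, and the page source overall.
import re
import urllib.request

URL = "http://www.example.com/miami-seo/"  # hypothetical page
VARIANTS = ["miami seo", "seo miami", "seo in miami"]

html = urllib.request.urlopen(URL).read().decode("utf-8", errors="replace").lower()
title = re.search(r"<title>(.*?)</title>", html, re.S)
headers = re.findall(r"<h[1-3][^>]*>(.*?)</h[1-3]>", html, re.S)

for variant in VARIANTS:
    places = []
    if title and variant in title.group(1):
        places.append("title")
    if any(variant in h for h in headers):
        places.append("h1-h3")
    if variant in html:
        places.append("page source")
    print(variant, "->", places or "not found")
```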
As far as high-importance factors go, there has traditionally been a strong correlation between placing the keyword at the front of the title tag and higher rankings. This correlation isn't as strong today, but it still exists.
http://www.seomoz.org/learn-seo/title-tag
Hope this helps. Best of luck with your SEO!
-
RE: Nofollow Outbound Links on Listings from Travel Sites?
Good question.
On one hand, I'm a fan of linking out and passing link equity. There's a good correlation between linking out and higher rankings (though I don't believe we've ever studied the difference between followed and nofollowed links in this regard). I hate to see links nofollowed simply to protect against Google actions, but it is a reality of doing business.
To me, it comes down to how many of the sites are actual spam. "Low quality" is certainly different from spam. If it's a handful of sites out of thousands, I wouldn't worry about it too much. Generally, tourism websites are far more trustworthy than sites in the gambling/adult/pharmaceutical verticals.
Now, on the other hand, if you do choose to nofollow the links, you probably won't see too many negative consequences.
In the end, I think you have to gauge how bad the sites you're linking to really are, and make your judgment from there.
-
RE: Wildcards in Backlinks
Are you looking for domains that link to the root domain golffacility.com? If so, be sure to filter for "pages on this root domain" in the linking domains report. Here's the link:
...which shows over 600 linking domains.
Or individual links here:
Now, OSE will only show you up to 25 links from each domain. We've found that beyond this number, each additional link adds little value (often these are sitewide links).
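To see the effect of that cap on your own data, here's a small Python sketch that collapses a raw backlink list to unique linking domains, keeping at most 25 URLs per domain. The sample links are placeholders for your own CSV export.

```python
# Small sketch: collapse a raw backlink list to unique linking domains,
# mirroring OSE's 25-links-per-domain cap.
from collections import defaultdict
from urllib.parse import urlparse

raw_links = [
    "http://blog.example.com/post-1/",
    "http://blog.example.com/post-2/",
    "http://another-site.example/resources/",
]  # in practice, read these from your CSV export

by_domain = defaultdict(list)
for link in raw_links:
    domain = urlparse(link).netloc
    if len(by_domain[domain]) < 25:   # keep at most 25 links per domain
        by_domain[domain].append(link)

print(len(by_domain), "unique linking domains")
```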
By comparison, if you search Ahrefs for links to just the homepage, it lists 14 links, consistent with OSE:
https://ahrefs.com/site-explorer/refdomains/exact/golffacility.com
-
RE: Does a non-canonical URL pass link juice?
Complex question!
Caveat: I don't work for Google, and the precise workings of the canonical element in Google's algorithm are mostly educated speculation. The answer is somewhere in between yes and no. That's because the canonical element means that URL B is treated as URL A. In that sense, it really shouldn't pass any direct link authority.
But(!) now let's complicate things. Let's point some links at URL B (and not at URL A). In theory, those links are then canonicalized to URL A, and that equity passes to your site (yeah!)
So it's not a direct influence, but you can in theory gain link equity from canonicalized versions of URLs that point to your site.
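To make the logic concrete, here's a toy Python model of that flow: links pointing at URL B get credited to URL A once B canonicalizes to A. This is purely illustrative, not Google's actual math, and the URLs are placeholders.

```python
# Toy model: links pointing at a canonicalized URL (B) get credited to
# the canonical target (A). Illustrative only -- not Google's algorithm.
canonical_map = {"http://site.example/b": "http://site.example/a"}

incoming_links = {
    "http://site.example/a": 10,   # links pointing directly at URL A
    "http://site.example/b": 5,    # links pointing at the canonicalized URL B
}

equity = {}
for url, links in incoming_links.items():
    target = canonical_map.get(url, url)   # resolve to the canonical URL
    equity[target] = equity.get(target, 0) + links

print(equity)  # {'http://site.example/a': 15}
```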
-
RE: Site explorer Issue
Hi Cladio,
Mozscape doesn't include every link on the web, but it does usually include links that are likely to most influence your rankings. Links on large directory sites, or on pages beneath several layers of navigation, often don't get included.
It's similar to how Google may not crawl/index pages very well on DMOZ - it's a large important site, but there simply isn't enough link juice to spread through all the pages.
(that said, our engineers are working on updates so fewer important links get dropped from the index)
Here's how Moz Help Team manager Aaron Wheeler explained it to a customer:
"Linkscape focuses on a breadth-first approach, and thus we nearly always have content from the homepage of websites, externally linked-to pages and pages higher up in a site’s information hierarchy. However, deep pages that are buried beneath many layers of navigation are sometimes missed and it may be several index updates before we catch all of these.
If our crawlers or data sources are blocked from reaching those URLs, they may not be included in our index (though links that point to those pages will still be available). Finally, the URLs seen by Linkscape must be linked-to by other documents on the web or our index will not include them."
-
RE: Traffic has not recovered from https switch a year ago.
We've actually seen Google get harsh on category-type pages across a wide number of industries and sites. It's even happened here at Moz. If your HTTPS is implemented correctly (and it sounds like you're reasonably certain it is), you might want to look to other areas.
I'd look at your category pages and make sure:
- Pagination is implemented correctly
- Canonicals are in place, where appropriate
- If possible, each category should have its own introductory text, e.g. https://moz.com/ugc/category/link-building
- Basically, do everything you can to treat your category pages like actual landing pages worthy of search traffic, including unique content, value, title tags, descriptions, etc.
-
RE: DA as a measure of link-building success/failure?
Moz data scientists are working on spam scores that will help in analysis, but it's not ready for public consumption yet.
One often-used tool for detecting penalties is Searchmetrics' Visibility charts. It's not hugely accurate, but it can often give you a rough guess: http://suite.searchmetrics.com/en/research/domains/visibility
-
RE: Not necessary to have keywords in the page? Do you agree?
Sounds like your SEO consultant is taking a small fact and blowing it out of proportion.
Yes, while it is possible to rank for a given keyword without actually having the keyword on the page, the vast majority of the time the keyword, or a close variant, is found in several parts of the HTML. The most common place is the title tag, but other common locations include the body text, headers, image alt attributes, meta descriptions, and so on.

Unless you have very good links pointing at your site that reference your keywords (either directly or possibly through co-citation), you face an uphill battle trying to rank for your given terms if you don't include the keyword in your content or other HTML elements.

This is a highly studied concept. If you're interested in the raw data, you may want to check out SEOmoz's 2011 ranking factors: http://www.seomoz.org/article/search-ranking-factors#metrics-6

Or a more recent correlation study performed by The Open Algorithm: http://www.theopenalgorithm.com/correlation-data/on-page-factors/

Another area that may interest you is LDA, which stands for Latent Dirichlet Allocation. This refers to how keywords topically associated with one another are positively correlated with higher rankings. A company called Virante has created a couple of tools around this concept: http://ntopic.org/ and http://www.virante.org/seo-tools/lda-content-optimizer

Hope this helps! Best of luck with your SEO.
-
RE: ERROR: Too Many on-page links!
Hi There!
We moved "too many links" under "Site Information" so it no longer counts as an error. We still include it in your report just for your personal knowledge, but in most cases there is nothing you need to worry about.
Too many links can dilute link authority and may not present the best user (or robot) experience, but these days search engines can generally crawl all of them.
Dr. Pete wrote a good post on the subject awhile back: http://moz.com/blog/how-many-links-is-too-many
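If you want a quick count for your own pages, here's a rough Python sketch. It's regex-based, so treat the number as an estimate; the URL is a placeholder.

```python
# Rough sketch: count the anchor tags with an href on a page.
import re
import urllib.request

URL = "http://www.example.com/"  # hypothetical page
html = urllib.request.urlopen(URL).read().decode("utf-8", errors="replace")
links = re.findall(r"<a\s[^>]*href=", html, re.I)
print(len(links), "links found on the page")
```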
Best of luck with your SEO!
-
RE: How effective are nofollow links today (2013) ?
This is a complex issue, but let me share with you:
- What we know to be true &
- What most consider best practices
1. Google has stated many times in the past that nofollow links pass neither PageRank nor anchor text.
2. That said, many SEOs believe nofollow links do play some role in search engine rankings, although the relationship is elusive and difficult to define. The 2011 SEOmoz Ranking Factors did show a correlation between nofollow links and higher rankings (but now I'm getting off topic).
3. Most good SEOs believe that PageRank sculpting either doesn't work, or is generally such a low ROI activity that it isn't worth it. (However, not everyone holds this opinion)
4. Google is getting better at parsing javascript links (although we don't know what kind of link signals are passed through them) and there is even evidence that Google passes link signals through iframes, but again we can't quantify what signals/how much.
**5. Best Practices** - Generally, you don't want too many links on a page, and you want your important links in prominent places.
If you have too many links that you don't want followed, you should consider consolidating them into a single link. For example, you can put your Contact, Privacy, Info and FAQ into a single link/landing page that leads to a better experience for users and search engines.
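If you want to see where you stand before consolidating, here's a minimal Python sketch that tallies followed vs. nofollowed links on a page. It's regex-based and the URL is a placeholder, so treat the numbers as rough.

```python
# Minimal sketch: tally followed vs. nofollowed links on a page.
import re
import urllib.request

URL = "http://www.example.com/"  # hypothetical page
html = urllib.request.urlopen(URL).read().decode("utf-8", errors="replace")

anchors = re.findall(r"<a\s[^>]*>", html, re.I)
nofollowed = [a for a in anchors if re.search(r'rel=["\'][^"\']*nofollow', a, re.I)]
print(f"{len(anchors)} total links, {len(nofollowed)} nofollowed")
```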
See this article by Rand on link consolidation. It's from 2009, but just as relevant today: http://www.seomoz.org/blog/link-consolidation-the-new-pagerank-sculpting
Hope this helps! Best of luck with your SEO.
-
RE: Websites First Crawl - Over 2 Hour Suggested Wait
Hi Louis,
You are right, the first starter crawl should be complete within the first couple of hours. One reason it might take longer than normal is if the SEOmoz crawler, rogerbot, has trouble crawling your site.
I took a look inside our system and it appears the website associated with your campaign doesn't have a robots.txt file. Because the SEOmoz crawler is extra polite about crawling, whenever it doesn't find a robots.txt file, this can prevent it from crawling at all.
To fix this, create a text file called "robots.txt" with the following information in it, and place it in the root directory of your site:
User-agent: *
Disallow:

Read more about robots.txt here: http://www.seomoz.org/learn-seo/robotstxt
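Once the file is in place, you can confirm it's reachable and doesn't block the crawler with Python's standard-library robots.txt parser (the site URL below is a placeholder):

```python
# Quick check: confirm robots.txt is reachable and allows rogerbot to crawl.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")  # hypothetical site
rp.read()
print("rogerbot allowed:", rp.can_fetch("rogerbot", "http://www.example.com/"))
```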
Unfortunately, it may be too late to add this to fix your starter crawl. So here are some things you can try:
- After adding the robots.txt file, try creating another (duplicate) campaign and seeing if that works.
- Or wait until your next crawl
- Contact the help team at help@seomoz.org
Hope this helps! Best of luck with your SEO.
-
RE: How effective are nofollow links today (2013) ?
Er... sorry if I wasn't clear. Iframes may or may not pass PageRank, but I would hardly ever recommend using them if you want to control your SEO variables.
Best practice would be to make the majority of your links followed, and to consolidate less important links.
-
RE: SearchDex for SEO consultation? Price feedback?
Hi Rick,
Unfortunately I don't have much experience with SearchDex, and I certainly don't want to comment on a company I don't know much about (other than the JCPenney thing).
But I will tell you how I might evaluate a company.
- I'll look at their social media profiles. Do they follow industry leaders on Twitter?
- Do they contribute to the SEO community (blogging, conferences, participation in forums like this)
- Do they have a decent blog?
- Who are their clients? Most, but not all, firms have a public list of clients you can check out. How do they rank?
- Do they guarantee rankings? This is usually a bad sign.
Our friend Rhea Drysdale recently wrote about this on her company blog.
Excerpt...
- Request case studies and contacts from like industries/site size to ensure they can handle your work
- Testimonials (if these aren’t published, ask for them, it may simply be that the SEO company has NDAs that protect their clients — like Outspoken Media does!)
- Talk to their current clients
- Along the same lines, if they display client logos, call them
- Google them and check out their online reputation
- Look at how the site is written and identify warnings (e.g. big guarantees or language that sounds too good to be true)
- Check the site’s own rankings, community engagement, backlink count and domain authority, etc.
- Check industry visibility
- Ask for their methods
- Question their ethics for grey areas to see if they align with your needs
- Question their tool set (Does it sound outdated? If it’s proprietary, do they keep the data if you leave? Can they report on what you need?)
If you're generating a few hundred thousand a year in revenue, this is something worth investing in. I'd advise talking to at least 2-3 different vendors and getting proposals from all of them.
Finally, you may already be familiar with it, but the SEOmoz Recommended List is always a good place to start.