Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Thanks guys. I knew it lost some link juice, but it seems like it's definitely worth the extra effort to go back and update the old links. Thanks for the input! -Dave

    Link Building | | DavidCurrier
    0

  • It definitely does not influence search engine results in any way whether you use www or not. It is purely a matter of preference. One consideration you may wish to factor into your decision is links. A shorter URL allows for easier link sharing. If you do not use www, you save four characters (www.), which makes your links shorter. But...most software will recognize any word beginning with "www." as a link and convert it into a hyperlink. So www.mydomain.com would become a hyperlink, but mydomain.com would not. You would need to add http:// in front of mydomain.com to make it a hyperlink. Then again, some software still doesn't convert based on the "www.", so you would need to enter the full http://www.mydomain.com. I probably crossed the line into giving too much information. We live in a world of tweets and friendly URLs, so I thought it was worth mentioning.

    Technical SEO Issues | | RyanKent
    0
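Whichever hostname you settle on, a common follow-up is to 301-redirect the other form to it so all inbound links consolidate on one URL. A minimal Apache .htaccess sketch (example.com is a placeholder; adjust for your own server):

```apache
# Redirect www.example.com to the bare domain with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```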

  • Thanks Ryan I will check it out...

    Social Media | | GrouchyKids
    0

  • Exactly. And so, wouldn't it be greatly beneficial for all of us to know if and when a limit is reached where this strategy is no longer effective? For example, there are many PR8 sites with literally hundreds of PR6 pages that allow dofollow commenting. We can vary the anchor text and the deep link to gain links from these PR6 pages. The question is: when does this strategy become ineffective? Let's say our site has 100k pages. Should we spend our time getting a link from every available PR6 page on the same domain? Or is there diminishing value? A tried-and-proven study showing whether a persistent benefit exists, and when it wears off, would be invaluable to practical SEO, and the results of such a study are highly unlikely to change within a year. Surely you'd like to see something like this too? I do understand the need to keep SEO in line with Matt Cutts's objectives; however, the reality is that Matt Cutts's objectives and what works are two different things. There would be no such thing as off-site SEO at all if Google worked the way it's meant to. The thing is, it doesn't, and that is why off-site SEO exists. Instead of people giving hogwash answers, we should be demanding these sorts of useful studies. That is just my opinion anyway.

    Technical SEO Issues | | stevenheron
    0

  • Could be they were swamped with other work - their plates have been pretty full lately. (Not an excuse or justification, just a possibility.) I've just sent a private message to someone to see if they can jump on it for you.

    Moz Tools | | AlanBleiweiss
    0

  • Thanks for the great feedback in the P.S. - we really appreciate it! Did you know we have a feature request forum? It's a great place to share your ideas. Other people can vote on them and it will help us determine priorities. You should check it out: http://seomoz.zendesk.com/forums Thanks! Keri

    Link Building | | KeriMorgret
    0

  • The warning for too many nofollowed links is there because these links affect your SEO. Your page offers SEO benefits to your site. If you create a "hello world" page, that page will inherit some value from being part of your domain. It will receive further value from links to the page, social sharing of the page (tweets, etc.), and so forth. The value of the page can be passed along to other pages, both inside and outside of your site, through links. The nofollowed links devalue any followed links on your page. If you want to use your blog to pass more juice to other pages within your site, I would recommend a comment tool that doesn't just tag the links with "nofollow" but instead breaks them so they appear as text, not links. To answer your question directly: YES, I think too many nofollowed outbound links are a problem that should be fixed. If you don't believe these pages offer value to your site, you can ignore the warning, but I think doing so is a missed opportunity.

    Intermediate & Advanced SEO | | RyanKent
    0
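The "breaks them so they appear as text" approach mentioned above can be sketched in a few lines. This is a hypothetical helper for simple comment HTML, not any particular comment tool:

```python
import re

def break_links(comment_html):
    """Replace anchor tags in comment HTML with plain text, so no <a>
    element (followed or nofollowed) remains to pass link equity."""
    def repl(match):
        href, text = match.group(1), match.group(2)
        # Keep the visible text and show the URL alongside it.
        return f"{text} ({href})" if text != href else href
    return re.sub(r'<a\s+[^>]*href="([^"]+)"[^>]*>(.*?)</a>', repl, comment_html)
```

A nofollowed comment link such as `<a href="http://x.com" rel="nofollow">my site</a>` would be rendered as `my site (http://x.com)` - visible to readers, invisible to link graphs.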

  • I've done as you suggested. I came across the issue after running a report that showed 40 links with 302 status; all turned out to be trackbacks. While the above advice takes care of the future, is there any harm in having these existing 302 trackbacks, or any advantage in making them 301s? Thank you, Dan

    Web Design | | YesThatDanGreen
    0
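For reference, if those existing trackback URLs should answer with a 301 instead, one common place to do it is the web server configuration. A hedged Apache sketch with placeholder paths (your actual URLs will differ):

```apache
# .htaccess: answer an old trackback URL with a permanent 301
# (source path and target URL are placeholders)
Redirect 301 /old-post/trackback/ http://example.com/old-post/
```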

  • That's great to hear, Sha. Sometimes it's not so easy to figure out the cause; sometimes it's just knowing where to look. And yes, a 1,200-to-1 link-to-root ratio isn't very healthy at all.

    Moz Pro | | AlanBleiweiss
    0

  • These are all great responses. In the coming months, we will be integrating the following URLs into the core domain of steripen.com: steripen.com (custom CMS), community.steripen.com (WordPress blog), and buysteripen.com (Magento store). Unfortunately, we can't fold our GetSatisfaction implementation under the domain as well; we will have to keep a subdomain for that. Not looking forward to the potential effects of this, but all this feedback is greatly appreciated.

    Social Media | | Timmmmy
    0

  • Rel Author is not for developers.  It's for the author of the content.

    Web Design | | AlanBleiweiss
    0

  • Ryan's right - it's because the page isn't in our index yet. You can verify this at www.opensiteexplorer.org. Sorry about that! Most new sites and links will be indexed by our spiders and available in Linkscape and Open Site Explorer within 60 days, but some take even longer for a plethora of reasons, including the crawl-ability of sites, the number of inbound links to them, and the depth of pages in subdirectories. Just so you know, here's how we build our index: we take the last index, take the 10 billion URLs with the highest mozRank (with a fixed limit on some of the larger domains), and start crawling from the top down until we've crawled 40,000,000,000 pages (about 1/4 of the amount in Google's index). Therefore, if a site is not linked to by one of these seed URLs (or one of the URLs linked to by them in the next update), it won't show up in our index.

We update our Linkscape index every 3 to 5 weeks. Crawling the whole internet to look for links takes 2-3 weeks, and then we've got 1-2 weeks of processing to do on those links to determine which are the most important, etc. You can see a schedule of how often we update, and planned updates, here: http://seomoz.zendesk.com/entries/345964-linkscape-update-schedule

Linkscape focuses on a breadth-first approach, and thus we nearly always have content from the homepages of websites, externally linked-to pages, and pages higher up in a site's information hierarchy. However, deep pages that are buried beneath many layers of navigation are sometimes missed, and it may be several index updates before we catch all of these. If our crawlers or data sources are blocked from reaching those URLs, they may not be included in our index (though links that point to those pages will still be available). Finally, the URLs seen by Linkscape must be linked to by other documents on the web or our index will not include them.

For now, the best thing you can do to help your domain become indexed is to work on link building for links from sites with high mozRank. I hope this helps! This is more of a customer service question than an SEO question, so in the future, if you have a question about our site or tools, please email it to help@seomoz.org to get faster service. Thank you, David! Best of luck.

    Technical SEO Issues | | AaronWheeler
    1
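The index-building process described above (seed with the highest-mozRank URLs, then crawl breadth-first until a page budget is exhausted) can be sketched on a toy link graph. This is an illustration of the general technique, not SEOmoz's actual pipeline:

```python
from collections import deque

def breadth_first_crawl(seeds_by_score, link_graph, page_budget):
    """Crawl outward from the highest-scored seeds, breadth-first,
    stopping once the page budget is reached. Pages with no inbound
    path from the seed set never make it into the index."""
    # Start from the seeds, ordered by descending score (e.g. mozRank).
    queue = deque(url for url, _ in
                  sorted(seeds_by_score.items(), key=lambda kv: -kv[1]))
    indexed, seen = [], set(queue)
    while queue and len(indexed) < page_budget:
        url = queue.popleft()
        indexed.append(url)          # "crawl" this page
        for linked in link_graph.get(url, []):
            if linked not in seen:   # enqueue newly discovered URLs
                seen.add(linked)
                queue.append(linked)
    return indexed
```

Note how a page with no inbound link path from the seeds never appears in the result, which is exactly why link building from already-indexed sites helps a new domain get picked up.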

  • Here's the problem: the client's domain name sucks. It is short, but it combines an acronym with one of the words in its long-version name. It uses the British spelling of the long-name fragment, even though most Canadians now use American spelling. And it is a .ca rather than a .com. I would be using "colorful" language too! But I would like some additional validation before proceeding. More colorful language.

    - How much link juice might we lose? I've seen the figure of 10% bandied around. Is it accurate? Nobody knows for sure. Not an awful lot.

    - Might we see a temporary dip in results? If so, how long would it last? You might have unstable SERPs for a few days to a couple of weeks. It's possible that your rankings could drop a place or two. But most of the time there is very little damage if you do the switch properly.

    - What questions did I forget to ask? What additional info do you need to offer informed advice? Your question seems to be "Should I do this?" If this was my site, the answer would be yes.

    In the past year we have done two domain moves. One was positive results all around, great results. The other was up everywhere except the trophy KW, where we slipped from #1,2,3 to #2,3,4, but we are working to get it back. Sales are still very strong.

    Intermediate & Advanced SEO | | EGOL
    0

  • My view on tags is that they are helpful for humans to find information on a specific topic quickly. Since they are more beneficial to human visitors than to search engines, I have always blocked tag pages from being indexed via robots.txt. I also block categories and feeds from being crawled. This may be overkill, but it has worked well for me to avoid duplicate content issues on the various blogs I help manage.

    Content & Blogging | | dmoore
    0
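A minimal robots.txt along those lines, assuming WordPress-style /tag/, /category/, and /feed/ paths (adjust the paths to your blog's URL structure):

```
User-agent: *
Disallow: /tag/
Disallow: /category/
Disallow: /feed/
```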

  • I've never seen any benefit proven and certainly wouldn't bother. Here's a short video from Matt Cutts answering just this question.

    On-Page / Site Optimization | | firstconversion
    0