Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you don't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • This is an interesting article, Theo. I guess what you are saying, and the way I am interpreting it, is that domain authority and trust are more important, which I agree with. However, I would still think there is some mileage in this.

    Intermediate & Advanced SEO | | robertrRSwalters
    0

  • Short answer: no. Longer answer: read the following article (http://www.submitawebsite.com/blog/2010/03/does-buying-google-adwords-improve.html) on the relationship between SEO and PPC.

    Paid Search Marketing | | Theo-NL
    0

  • Jonathan, the "rule" used to be 100 links to a page, based on Google having included that figure in their guidelines. They've since removed that numeric value without replacing it with another number.

    What I find on my large client sites, where there are hundreds or thousands of products in a category, is that pagination is the method that works. The key is to append "Page X" to each page's Title, URL, and h1. This is best simply because it helps ensure Google discovers all the products and properly credits them to the core category.

    By trying to force all of the products onto a single page, and using any method that hides most of them initially for usability, you introduce the possibility that not all those products will be discovered, no matter how good a job Google does at discovering links inside CSS or JavaScript. In reality, their system is far from perfect, and with all that added code, the possibility exists that you cause crawl problems due to imperfect code.

    On-Page / Site Optimization | | AlanBleiweiss
    0
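The "Page X" convention described in the answer above can be sketched in code. This is a hypothetical illustration, not anything from the thread: the function names, the URL scheme (`/slug/page/N/`), and the `" - "` separator are all assumptions; the only point taken from the answer is appending "Page X" to the Title, URL, and h1 of every paginated category page.

```python
def paginated_meta(category_title: str, slug: str, page: int) -> dict:
    """Build Title, URL, and h1 for page `page` of a paginated category.

    Page 1 is the core category page and keeps the plain title;
    deeper pages get "Page X" appended, per the convention above.
    """
    if page <= 1:
        return {
            "title": category_title,
            "url": f"/{slug}/",
            "h1": category_title,
        }
    suffix = f"Page {page}"
    return {
        "title": f"{category_title} - {suffix}",
        "url": f"/{slug}/page/{page}/",
        "h1": f"{category_title} - {suffix}",
    }

print(paginated_meta("Running Shoes", "running-shoes", 3))
```

Because every paginated URL and title is unique, a crawler that follows the pagination links will see each product page credited back to one unambiguous category.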

  • Thank you very much, it's all clear now.

    Technical SEO Issues | | e-Lustre
    0

  • Thanks, Keri. I'll jump over there as well to cover my bases.

    Moz Tools | | AlanBleiweiss
    0

  • Hey Websensible, No, we haven't fixed it - because it isn't really a bug - we are just grouping all anchor text counts (internal and external) together at the moment. I am guessing any similarity between the counts is just coincidence, since we haven't changed this behavior. -Kate

    Moz Tools | | katemats
    0

  • I'll build on what Mike said: comment links are less valuable than they used to be due to the proliferation of this as a spam tactic, and I expect they'll become even less valued as time marches on. For the most part the value is weak at best nowadays, and getting it right requires a lot more time and effort, given the attention this type of link has garnered from Google due to the spam issue.

    Link Building | | AlanBleiweiss
    0

  • Oh - and one last suggestion - just above the two addresses in the bottom area of all pages, have a statement about how you serve Brevard and Collier counties. Something like "The Musil Law Firm has been serving Brevard and Collier counties since XXXX". And have a similar but unique statement on your home, About, and Contact pages, directly in the main content area of each.

    Link Building | | AlanBleiweiss
    0

  • In Open Site Explorer, you can put in a URL for a specific page, and it defaults to showing backlinks for just that particular page, not the whole domain.

    Moz Tools | | KeriMorgret
    0

  • akitmane, are the errors you see pointing to pages that no longer exist? If that's the case, the best fix I've found with WordPress is the "Simple 301 Redirects" plug-in. It lets you enter the URL of the old page and the URL of the page you want people to go to if they try going to that old URL.

    If they're pages that never existed, the question becomes: why are people trying to get to pages that never existed? If they're innocent errors, or links from other sites where the person who made the link made a mistake in the URL, you can either use the Simple 301 Redirects plug-in for those specific links, or contact the owner of the site that created the flawed link.

    Ultimately, you should have a custom 404 page set up (located in the same folder as the theme you're using for your site's template). If that's not working, you will need a WordPress developer or consultant to help you figure out why.

    Technical SEO Issues | | AlanBleiweiss
    1
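At its core, the plug-in approach the answer describes is just an old-URL → new-URL map consulted before falling through to the 404 handler. A minimal language-agnostic sketch of that idea, with entirely made-up example paths (the real plug-in stores its map in WordPress settings, not in code like this):

```python
# Illustrative old-URL -> new-URL map; entries here are invented examples.
REDIRECTS = {
    "/old-page/": "/new-page/",
    "/2009/widget-review/": "/reviews/widgets/",
}

def resolve(path: str):
    """Return an (HTTP status, Location) pair for an incoming request path."""
    if path in REDIRECTS:
        # 301 Moved Permanently tells crawlers to transfer the old URL's
        # credit to the new one.
        return 301, REDIRECTS[path]
    # No mapping: let the site's custom 404 page handle it.
    return 404, None

print(resolve("/old-page/"))
```

The same two-step logic (check the map, else serve the custom 404) applies whether the redirects live in a plug-in, an .htaccess file, or server config.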

  • Glad it helped, Mike. Lots of theories and beliefs about long tail. What I've found on several client sites is that following these rules increases the number of long-tail phrases each page is found for:

    • Designate two primary phrases, two or three words each.
    • Designate two or three highly related secondary phrases, two to four words each.
    • Seed the page Title and h1 with the two primaries.
    • Seed the URL with one of the primaries.
    • Integrate each of the primaries into the content area's descriptive text at least twice, in exact-match sequence.
    • Integrate each of the secondaries into the content area's descriptive text at least once, in exact-match sequence.
    • Use partials of those phrases at least once each in the content area's descriptive text. Of course, the more content you write, the more you can seed phrases, but only where it makes sense to readers.
    • Write the content in a high-quality way that really sounds human.
    • Tightly group pages of content based on phrase relationships.

    When I follow these guidelines, I typically see a 30% or more increase in the total phrases a site is found for. Of course it's not an exact science, since there are so many factors in SEO. But doing it this way, where the content really comes across as naturally written, can result in exponential long-tail phrases you didn't intentionally try to focus on beforehand.

    Keyword Research | | AlanBleiweiss
    0
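The countable parts of the guidelines above (primaries in Title and h1, primaries twice in the body, secondaries once) lend themselves to a simple audit script. This is a hedged sketch: the function name, field names, and sample phrases are all assumptions for illustration, and it only checks exact-match occurrence counts, not the qualitative rules about natural-sounding writing.

```python
def check_seeding(title, h1, body, primaries, secondaries):
    """Report whether each phrase meets the suggested exact-match counts.

    Primaries: should appear in the Title and h1, and at least twice
    in the body copy. Secondaries: at least once in the body copy.
    """
    body_lower = body.lower()
    report = {}
    for p in primaries:
        report[p] = {
            "in_title": p.lower() in title.lower(),
            "in_h1": p.lower() in h1.lower(),
            "body_count_ok": body_lower.count(p.lower()) >= 2,
        }
    for s in secondaries:
        report[s] = {"body_count_ok": body_lower.count(s.lower()) >= 1}
    return report

report = check_seeding(
    title="Blue Widgets | Example Shop",
    h1="Blue Widgets",
    body="Blue widgets are great. Buy blue widgets today. Cheap widgets here.",
    primaries=["blue widgets"],
    secondaries=["cheap widgets"],
)
print(report)
```

A script like this only flags mechanical gaps; whether the phrasing still reads naturally is a judgment call the checklist leaves to the writer.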

  • You can't get sandboxed for what points to you (if that were possible, you would just do whatever was necessary to sandbox all of your competitors). You CAN get sandboxed for pointing into bad neighborhoods, and pointing to web rings or reciprocal linking with bad sites could fall into this category. There's no shortcut. Work on getting valuable links from real sites. Give them some unique content (a guest blog article, perhaps) in exchange for a link back. It's more work, but it's white hat and the link will pass real juice to your site.

    Link Building | | scanlin
    0

  • Don't do it. Yes, they will be spammy, low-quality, and pass extremely little (probably zero) PageRank. The directory pages are probably not indexed, and if they are, they are probably devalued. And if there are 65,000 of them, then they're probably all owned by the same guy, in some templated database style, on the same C-class IP address. Google is wise to this and doesn't index, or doesn't value, those kinds of links.

    Link Building | | scanlin
    0

  • Hi Dirk, From the changes rolled out primarily last week, no, I don't know of anyone. I do know, however, of numerous sites that are making big changes to their content presentation and structure such that they'll no longer display the problems we're thinking Panda-slapped sites have. That is, removing duplicate content (by any number of means), consolidating thin or duplicated pages, removing spun content, etc. I'd check back in a few weeks, after people have had time to get these changes in place and have Google crawl their sites and update rankings. In many ways, I think it's too early to tell where rankings are going to fall.

    Search Engine Trends | | JaneCopland
    0

  • Mike's suggestion about understanding the current value of that site and all its individual pages is the right one. If there had been significant value at one time and it's dropped off, you'll need to rebuild it all over again. Also, if you're buying old domains and replacing the content with all-different content, you're essentially starting from scratch. Domain names by themselves won't provide enough value to make it worthwhile without all that new from-scratch work, and just using them for redirects is definitely not a quality SEO tactic in 2011.

    Link Building | | AlanBleiweiss
    0

  • And site speed is based on data from those users with the Google Toolbar installed. A tool like Firebug with YSlow can help identify things slowing down your site, and can give you another view of your site speed.

    On-Page / Site Optimization | | KeriMorgret
    0