Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Thank you for the info. We are just learning and testing, getting used to the language of SEO. Your answers have supported the direction we felt we needed to go, but we wanted a second opinion from someone who understood the whole picture. All the best... Beth

    | BakerBayBeadCo
    0

  • Hi Vinnie, Google only counts the first link when it comes to anchor text. There appear to be ways around this if you care to pursue them, but for the average person, yes, only the first link counts: http://www.seomoz.org/ugc/3-ways-to-avoid-the-first-link-counts-rule

    | beso
    1
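As an illustration of the first-link-counts behavior described above, the sketch below records only the first anchor text seen for each href in a page, which is the anchor Google would count under this rule. The HTML and hrefs are invented for the example; stdlib-only Python:

```python
from html.parser import HTMLParser

class FirstAnchorFinder(HTMLParser):
    """Keep only the first anchor text seen for each href,
    mirroring the 'first link counts' rule described above."""
    def __init__(self):
        super().__init__()
        self.first_anchor = {}   # href -> first anchor text seen
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            # setdefault keeps only the FIRST occurrence per href.
            self.first_anchor.setdefault(self._href, "".join(self._text).strip())
            self._href = None

# Two links to the same URL with different anchors (hypothetical page):
page = '<a href="/widgets">blue widgets</a> ... <a href="/widgets">click here</a>'
finder = FirstAnchorFinder()
finder.feed(page)
print(finder.first_anchor["/widgets"])  # → blue widgets
```

Only "blue widgets" survives; the second anchor ("click here") is ignored, which is the behavior the answer warns about.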

  • The only way is to wait - give it some time! Google should re-index you soon, and your rankings will start to stabilize once again.

    | Blargh123
    0

  • Hey there Eric, I can't think of any right now but you did remind me of this great blog post by Richard Baxter that you may find useful: http://seogadget.co.uk/solving-site-architecture-issues/

    | Hallam
    0

  • KBs are great for user-generated content (just look at this one). If it were open to all, just imagine how many links it would be attracting! Obviously your KB would be for SEO benefit, and therefore the best method for gaining from it would be to host it on the client's domain in a subfolder. Most open-source CMSs can easily be modified into knowledge bases with relative ease. One thing I would recommend is heavy moderation on all posts, to ensure that no incorrect answers or spam go live, as that would make their business look poor. Regards, Aaron

    | aarondicks
    0

  • In Latin America I use Dattatec.com

    | arielblanco
    0

  • You probably already know this, but there's a Firefox plugin called Google Global which you can use to search from other locations if you want to check the differences easily.

    | SteveOllington
    3

  • On the topic of a private crawl (or distributed crawl), these are cool ideas, but not something we currently have in our plans. Having the crawl centralized allows us to store historic data and ensure polite crawling. This may take a little extra time (we are indeed doing a lot of crawls, as well as processing them and retrieving link data for each of them), but we are actively working on our infrastructure to reduce our crawling and processing time. While the first crawl does take a number of days, subsequent crawls are started on the same day each week, and should take roughly the same amount of time to complete, controlling for external factors. So in general you should have fresh crawl data right around weekly, give or take a day or two. As for your specific crawls, I'd be happy to look into them for you. I'll send you a separate email to discuss.

    | adamf
    0
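The weekly cadence described above (subsequent crawls start on the same weekday as the first crawl) can be sketched as a small date calculation. The dates used are hypothetical:

```python
from datetime import date, timedelta

def next_crawl_date(first_crawl: date, today: date) -> date:
    """Crawls kick off weekly on the same weekday as the first crawl;
    return the next scheduled start date on or after `today`."""
    if today <= first_crawl:
        return first_crawl
    weeks_elapsed = (today - first_crawl).days // 7
    candidate = first_crawl + timedelta(weeks=weeks_elapsed)
    if candidate < today:
        candidate += timedelta(weeks=1)
    return candidate

# Example: first crawl on Monday 2011-05-02, checking mid-week.
print(next_crawl_date(date(2011, 5, 2), date(2011, 5, 11)))  # → 2011-05-16
```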

  • Shared IPs can cause issues with reverse lookups, which could make spiders confuse you with sites sharing that IP. Admittedly, it's rare, but there have been cases of penalties jumping within an IP, for example. I suspect that, as IP sharing grows, Google is getting better about this, but I generally like to avoid it.

    | Dr-Pete
    0

  • Dave, thanks for the clarification. You're definitely in a rare circumstance compared to most websites.
    In reality, since it's the Bible, there is going to be a duplicate content issue regardless, given how many sites currently publish this same content and how many more most likely will in the future. From Eternalministries.org to KingJamesBibleOnline.org, concordance.biblebrowser.com, and many others, sites are all offering this content.
    If you can find a way to offer your content in a unique way, and within your own site offer different versions of it (individual verses compared to entire chapters), then ideally yes, you'd want it all indexed. How you do that without adding your own unique text above or below each page's direct biblical content is the issue, though. Given this challenge, this is why I offered the concept of not indexing variations. Even if you weren't hit by the Panda update, any time Google has to evaluate multiple pages across sites where the content is either identical or "mostly" identical, someone's content is going to suffer to one degree or another. Any time it's a conflict within a single site, some versions are going to be given less ranking value than others.
    So unfortunately it's not a simple, straightforward situation where duplication avoidance can be guaranteed to provide the maximum reach, nor is there a simple way to boost multiple versions in a way that guarantees they'll all be found, let alone show up above "competitor" sites. This is why I initially offered what are essentially SEO best practices for addressing duplicate content. If you don't want to lose the traffic that now comes in by multiple means, the only other way to bolster what you've already got is to focus on high-quality, long-term link building and social media. The link building would need to focus on obtaining high-quality links pointing to deep content (specific chapter pages and specific verse pages), where the anchor text used in those links varies between chapter- or verse-specific words, broader Bible-related phrases, and the LDS brand.
    On the other hand, by implementing canonical tags you will definitely lose at least some of the visits that currently come in via variation URLs. Will that be compensated for by an equal or greater number of visits to the new "preferred" URL? In this rather unique situation there's no way to truly know. It is a risk.
    Which brings me back to the concept that you'd potentially be better off finding ways to add truly unique content around the biblical entries. It's the only on-site method I can think of that would allow you to continue to have multiple paths indexed. Combined with unique page titles, chapter/verse-targeted links, and social media, it could very well make the difference. With what, over 1,100 chapters and 31,000 verses, that's a lot of footwork. Then again, it's a labor of love, and every journey is made up of thousands of steps.

    | AlanBleiweiss
    0
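For the canonical-tag option discussed above, the idea is to map every URL variation to one preferred URL and emit a rel="canonical" tag pointing at it. A rough Python sketch; the URL patterns (query-string book/chapter/verse parameters, the example.org domain) are invented for illustration:

```python
from urllib.parse import urlparse, parse_qs

def canonical_for(url: str) -> str:
    """Map URL variations (query-string verse refs, trailing slashes)
    to one preferred path-style URL. Patterns here are hypothetical,
    and the scheme is normalized to https as the assumed preference."""
    parts = urlparse(url)
    qs = parse_qs(parts.query)
    if "book" in qs and "chapter" in qs:
        path = f"/{qs['book'][0]}/{qs['chapter'][0]}"
        if "verse" in qs:
            path += f"/{qs['verse'][0]}"
    else:
        # Strip trailing slashes so /about/ and /about collapse together.
        path = parts.path.rstrip("/") or "/"
    return f"https://{parts.netloc}{path}"

def canonical_tag(url: str) -> str:
    """Emit the <link> tag to place in the variation page's <head>."""
    return f'<link rel="canonical" href="{canonical_for(url)}" />'

print(canonical_tag("https://example.org/read?book=john&chapter=3&verse=16"))
```

All query-string variations of a verse then declare the same preferred URL, which is exactly the consolidation (and the risk of losing variation-URL entry points) the answer describes.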

  • I have wondered about this - my impression is that since Google is quite insistent on tracking fully-qualified subdomains, the impact of creating CNAME records for a related subdomain would be limited. Our blog is hosted via DreamHost, whereas our store is on Volusion - a prepackaged shopping cart solution. To implement a blog in the same domain, we'd have to create a CNAME record like blog.mysite.com and, of course, have a link to it from www.mysite.com. So far, I've always believed that this would only allow for - as you said - "a portion of the value" of the links... is there any reason to believe otherwise?

    | AspenFasteners
    0

  • If your Windows server doesn't support PHP, you cannot run PHP on those pages; you'd need a Linux (or PHP-enabled) server for that. This means you cannot put the PHP content in a subdirectory or folder. They are correct that it would have to be a subdomain, hosted on a Linux server. You can redirect a URL from your subdomain to a subdirectory URL, but if the content is in PHP, you cannot put the same content from the subdomain into a subfolder. Using a subdomain is not the end of the world. It is not ideal, but it won't kill you. Your best bet would be to try to change your hosting to a Linux server, then use a subdirectory. If not, you have no real choice.

    | DanDeceuster
    0
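If hosting is ever changed so the content can move into a subdirectory, each subdomain URL should be 301-redirected to its subdirectory equivalent. A sketch of that URL mapping (hostnames and the "blog" subdomain are illustrative):

```python
from urllib.parse import urlparse

def subdomain_to_subdirectory(url: str, subdomain: str = "blog") -> str:
    """Rewrite blog.example.com/post to www.example.com/blog/post,
    i.e. the target of the eventual 301 redirect. Hostnames are
    illustrative; URLs not on the subdomain pass through unchanged."""
    parts = urlparse(url)
    prefix = subdomain + "."
    if not parts.netloc.startswith(prefix):
        return url  # not on the subdomain; leave as-is
    root = parts.netloc[len(prefix):]
    return f"{parts.scheme}://www.{root}/{subdomain}{parts.path}"

print(subdomain_to_subdirectory("http://blog.example.com/seo-tips"))
# → http://www.example.com/blog/seo-tips
```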

  • I put up a new site recently that I had been link building to during development. It got indexed in a strange way - just the main keyword, hyphen, domain name... I had it set to noindex, but Google obviously followed a few links to it to see where they went. I'm hoping this will clear up for me once the site is re-cached. I assume you're experiencing another quirky Google ranking script, so it should all be cleared up on the next index of your site, regardless of the 302s that are actually supposed to be 301s. Aaron

    | aarondicks
    0
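To confirm whether a page like the one described above still carries the noindex directive, you can check its meta robots tag. A small stdlib-only Python sketch (the sample HTML is made up):

```python
from html.parser import HTMLParser

class RobotsMetaCheck(HTMLParser):
    """Detect <meta name="robots" content="...noindex..."> in a page."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            # Attribute values may be None; normalize case for comparison.
            a = {k.lower(): (v or "").lower() for k, v in attrs}
            if a.get("name") == "robots" and "noindex" in a.get("content", ""):
                self.noindex = True

html_doc = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
checker = RobotsMetaCheck()
checker.feed(html_doc)
print(checker.noindex)  # → True
```

Feeding it the page's source tells you whether the noindex setting is still live, which is worth verifying before waiting on a re-crawl.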