Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • I think you've made a smart choice, Nolan! Thanks for letting me give you my feedback and good luck with the Local SEO. It should be fun doing this for your brother. If I can be of any further assistance, please just let me know. Miriam

    | MiriamEllis
    0

  • Here's the screen in WordPress to check to see if you've blocked search engines through WP: http://codex.wordpress.org/Settings_Privacy_Screen

    | KeriMorgret
    0

  • $1,300 isn't much money. How much do you think you would make if you redesigned the site? We can't give you a good answer with such limited information. Also, the decision on the 301 needs more consideration. You need to think about relative rankings, traffic, unique link sources, duplicate content issues, and the keyword reach of the sites.

    | EGOL
    0

  • Kyle, Thanks for the quick response. The data is being displayed in the title and meta description fields. I also did some searches for specific terms with my parameter search from our site and filetype:pdf, which shows that the content is being indexed. It also shows that the PDF titles and meta descriptions are not optimized, so I have some work to do there. Thanks, Anthony

    | zazo
    0

  • UPDATE - Rankings and traffic were restored for us beginning this very morning. If something like this has happened to you... **I got a very authoritative link yesterday afternoon. Not 24 hours passed before rankings and organic traffic were restored.** This was after 2 weeks of "penalty". What a relief!!

    | poolguy
    0

  • Hi Diane, This answer from another Moz employee to a very similar question at http://www.seomoz.org/q/linkedin-how-to-use-it-to-promote-your-business may be helpful for you as well. You might also be interested in some other discussions about using LinkedIn and other social media for small businesses: http://www.seomoz.org/q/how-can-small-businesses-use-social-media and http://www.seomoz.org/q/getting-linkedin-connections (more about getting connections on LinkedIn, especially in the UK).

    | KeriMorgret
    0

  • Thank you for your responses. We use Endeca, but while they have a site map generator, for whatever reason they are unable to produce URLs that match our new SEO-friendly vanity URLs. Right now we've had no site map for months, as we're waiting to try and find a solution to this problem. From what I'm gathering, this is the right approach? As in, it would do more harm than good to upload a "bad" sitemap. Yes? Also, there seems to be no way to get around this with a clever redirect scheme. Am I right in this also? In which case, it may boil down to choosing between an accurate sitemap and SEO'd URLs. Not sure which would be more important. Website's here, if that's useful: www.pli.edu

    | emilyburns
    0

  • There is a field in the WordPress Dashboard where you can place the URLs of services you want to ping, and it will automatically do so when you update. However, if you would like to do so manually and hit a list of different services, try Ping-o-Matic.

    | TheARKlady
    0

  • You could try blocking via the X-Robots-Tag header in the .htaccess file: http://www.blackdog.ie/blog/remove-sitemaps-from-serps/

    | wojkwasi
    0
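For reference, the X-Robots-Tag approach the linked post describes amounts to sending a noindex header for the sitemap files from .htaccess. A minimal sketch, assuming Apache with mod_headers enabled (the filename pattern is just an example):

```apache
# Requires mod_headers; matches sitemap.xml and sitemap.xml.gz
<FilesMatch "sitemap\.xml(\.gz)?$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```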

  • More information about this setting and how it is used by Google can be found here; it will help answer your question. https://support.google.com/webmasters/bin/answer.py?hl=en&answer=44231 To be honest, though, I'd just update the GWT setting or the 301 so they are both the same...

    | lavellester
    0
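Making the 301 match the preferred-domain setting usually comes down to a host-level redirect in .htaccess. A minimal sketch, assuming Apache with mod_rewrite and using example.com as a placeholder for the actual domain:

```apache
RewriteEngine On
# Send the bare domain to the www version with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```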

  • Again, the real question is why are these duplicates being created in the first place? Got any answer to that?

    | AsadMemon
    0

  • No, unfortunately there is no way to prevent search engine indexation from within the markup of your web page itself. As you mentioned earlier in your question, you can either utilize the meta robots exclusion tag or the robots.txt file. If you are REALLY intent on blocking indexation of your promotional page, perhaps you can consider using an <iframe>? For example, create a totally new page with your promotional copy, block it with robots.txt, and ensure you have NO links pointing to it. Then, on your promotional page, use the <iframe> tag to pull in the content from the robots.txt-blocked copy. Honestly, I'm not sure if it'll prevent indexation since I've never tried it before, but it's just an idea. Good luck and tell us how it goes if you do! =]

    | Desiree-CP
    0
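The iframe workaround suggested above could be sketched like this; the file names are hypothetical, and as the answer itself notes, there is no guarantee it keeps the copy out of the index:

```html
<!-- promo-copy.html holds the promotional text and is blocked in robots.txt:
       User-agent: *
       Disallow: /promo-copy.html
     No other links should point at it. -->

<!-- On the promotional page, pull the blocked copy in via an iframe: -->
<iframe src="/promo-copy.html" width="100%" height="400" title="Promotion"></iframe>
```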

  • Oooh no thank you - I'm not a big risk-taker when it comes to SEO. he-he. Thanks again for your help!

    | Improvements
    0

  • Thanks for your reply John. Yes, you are right about XHR related content. What I was asking was about content being injected into the DOM (Document Object Model) using scripts which are loaded asynchronously using the async attribute. I believe this is a different case.

    | phaistonian
    0
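For clarity, the scenario being described is roughly the following (file name hypothetical): a script loaded with the async attribute that injects content into the DOM once it runs, as opposed to content fetched later via XHR:

```html
<div id="widget"></div>
<!-- The async attribute lets HTML parsing continue while widget.js downloads;
     once it executes, widget.js injects content into the DOM, e.g.:
     document.getElementById('widget').textContent = 'Injected content'; -->
<script async src="widget.js"></script>
```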