Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • If you're using WordPress and a plugin for showing a mobile version, you don't have to worry. But if you're simply copying and pasting onto another site, then it can be a problem. Watch the videos John links and you'll see.

    | mozalbin
    0

  • I just dusted off my memory banks (the hardware ones in my brain) and, if I am not mistaken (I don't know if it's still there), there was a paragraph in the Google terms of use stating that explicit content has to be at least 3 clicks away from the front page. On the other hand, I just did a quick search on porn and none of the results live up to that condition, so it doesn't seem like that rule is being enforced, if it was ever there. It is interesting, but I must admit I have never had a reason to research the subject, since I have never had a client with explicit content.

    | ReneReinholdt
    0

  • Thanks cabbagetoe. If I tweet this link again, what should my tracking code be?

    | seoug_2005
    0

  • How about having your developers script something that scrapes all 18,000 h1, h2, and h3 tags for each article and stores them in a database? Finding dupes would then be a piece of cake, even for a less experienced developer, and you could easily export all your duplicates to CSV and then manually rename them based on their content. Dev time: about 1 day max. (I've developed a lot of software myself, and IMHO a good developer should get this up and running within 4 hours.) If you don't have too many duplicate tags, correcting those in question shouldn't take too long either. Once you have done your chores, you could re-import your corrected title tags to the database, and your developer could write a script in the meantime that sets the title tag of a page according to the title tag you stated in your database. (A rough sketch of the scraping step follows below this post.) Hope that helped; if you have further questions on this, just go ahead. I had a similar problem with 25k+ pages for a major health insurer, and we figured out that the best way to prevent problems was to do most of the work manually rather than with a script. That helped us a lot to stay within the budget and the given timeframe.

    | akaigotchi
    0
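
    A minimal sketch of the scraping-and-export step described above, assuming the article URLs are already gathered in a urls.txt file (a hypothetical name) and using the requests and BeautifulSoup libraries; for brevity it keeps headings in memory rather than in a database:

        import csv
        from collections import defaultdict

        import requests
        from bs4 import BeautifulSoup

        headings = defaultdict(list)  # heading text -> URLs where it appears

        with open("urls.txt") as f:
            for url in (line.strip() for line in f if line.strip()):
                html = requests.get(url, timeout=10).text
                soup = BeautifulSoup(html, "html.parser")
                for tag in soup.find_all(["h1", "h2", "h3"]):
                    headings[tag.get_text(strip=True)].append(url)

        # export duplicates to CSV for the manual renaming pass
        with open("duplicate_headings.csv", "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(["heading", "count", "urls"])
            for text, urls in headings.items():
                if len(urls) > 1:
                    writer.writerow([text, len(urls), " ".join(urls)])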

  • Thanks for the advice. There are roughly 7,600 of these news excerpt pages accessed from different areas of my site. A complete archive of news excerpts is accessed here: http://www.admissionsquest.com/~SchlPostedNews/index.cfm/DisplayMax/999999999 Additionally, school-specific news excerpts are available from the various tabs on profiles that have connected school news RSS feeds. Here's an example of a profile and a linked excerpt:

    profile: http://www.admissionsquest.com/cfm_Public/pg_SchlInfo2.cfm/SchlID/842/School/The-Webb-School

    news excerpt: http://www.admissionsquest.com/cfm_Public/pg_SchlNewsItemDetail.cfm/SchlID/842/School/The-Webb-School/SchlNewsItemID/7764/Headline/Registration-and-Orientation-Day-Schedule

    In terms of them drawing traffic via search, they do; I see visitors accessing these pages via Google, etc. on a regular basis. Based on what you see above, should I:

    1. eliminate our excerpt page model and shift to simply displaying links to news items? Via this approach, clicking a link would take the visitor directly to the school's site; right now, they have to visit the excerpt page before clicking the link to jump to my clients' sites.

    2. add a noindex tag to keep them from being indexed?

    3. or maintain the status quo?

    Thanks again for chiming in, everyone. I very much appreciate the feedback and look forward to your responses.

    | peterdbaron
    0

  • Both yes and no. Yes, it's a linking strategy: PA 48 and dofollow links, and there's nothing wrong with that. No, I wouldn't build links that way. That notebook you linked to seemed to have bad structure, and it had spam written all over it. It would be interesting to see what thoughts other mozzers have about this. I'm split in two.

    | mozalbin
    0

  • Presenting your URLs without the technology ending (i.e., .html) is definitely preferable for numerous reasons. This question is frequently asked and I really should write an article on the topic. A couple of examples as to why it is preferable: the .html offers no value to consumers and makes your URLs longer; the .html lets the bad guys know what type of technology was used to create your pages; and the .html means that when you change your site to .php pages (a very common change) or any other technology, you will need to 301 your entire site, which would lead to a loss of the link equity you worked so hard to build. (A small sketch follows below this post.)

    | RyanKent
    0
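
    A minimal sketch of extensionless URLs in practice, assuming a Flask app and a hypothetical /about page; the legacy .html address gets a single 301 to the canonical, technology-free form:

        from flask import Flask, redirect

        app = Flask(__name__)

        @app.route("/about")        # canonical URL: survives any backend change
        def about():
            return "About us"

        @app.route("/about.html")   # legacy .html URL: permanent redirect
        def about_html():
            return redirect("/about", code=301)

        if __name__ == "__main__":
            app.run()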

  • Thanks for the feedback. Any advice on how to correct the issue? It displays as /~BoardingSchoolNotes/ at the root level. Do I need to 301 the first URL above to the second? Seems odd.

    | peterdbaron
    0

  • Adding Google Analytics is not absolutely necessary, but it could have a lot of advantages over your current analytics. Since Google also provides a bunch of other tools and services like PageSpeed and AdWords, it integrates pretty well with Analytics and could potentially give you a better view. Why not add it? It hardly takes any time to integrate (if you have a common page element) and it barely adds to site load time.

    | Syed1
    0

  • My valuation is based primarily on page authority, with consideration given to domain authority (both SEOmoz metrics). I usually discover about twenty potential links without buying any and create an Excel spreadsheet with the following columns: URL, Page Authority, Domain Authority, Price per Year. With this information you can easily tell which are a good value and which are not. Make sure to keep track of the links you pass on, because they may become more attractive once you've picked all the low-hanging fruit. Recently I started using Raven Tools to manage this process because the spreadsheet starts to get out of control. :) (A rough scoring sketch follows below this post.)

    | TheEvilchippy
    0
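
    A minimal scoring sketch along the same lines, assuming illustrative 70/30 weights for page vs. domain authority and hypothetical prospect data; a higher score means more authority per dollar per year:

        from dataclasses import dataclass

        @dataclass
        class LinkProspect:
            url: str
            page_authority: int    # SEOmoz PA, 0-100
            domain_authority: int  # SEOmoz DA, 0-100
            price_per_year: float  # asking price in USD

        def value_score(p: LinkProspect) -> float:
            # weight PA above DA, as the post suggests, and normalize by price;
            # the 0.7/0.3 split is an assumption, not a standard
            return (0.7 * p.page_authority + 0.3 * p.domain_authority) / p.price_per_year

        prospects = [
            LinkProspect("http://example-a.com/resources", 48, 55, 250.0),
            LinkProspect("http://example-b.com/partners", 35, 60, 400.0),
        ]

        # best value first
        for p in sorted(prospects, key=value_score, reverse=True):
            print(f"{value_score(p):.3f}  {p.url}")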

  • Oh wow, I had totally misinterpreted that article. Thanks for clarifying!

    | john4math
    0

  • Check out this video: http://www.seomoz.org/blog/whiteboard-friday-dealing-with-duplicate-content It will give you a much more thorough answer than just a percentage of uniqueness. But if you want that kind of answer, I mostly hear guesses of between 20% and 40% unique content. (One rough way to measure it is sketched below this post.)

    | AdoptionHelp
    0
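
    One rough way to put a number on uniqueness, assuming a shingle-overlap comparison between a page's text and the shared template text; the 5-word shingle size and the sample strings are illustrative, not a standard:

        def shingles(text, n=5):
            """Word n-grams of the text, lowercased."""
            words = text.lower().split()
            return {" ".join(words[i:i + n]) for i in range(max(0, len(words) - n + 1))}

        def percent_unique(page_text, template_text):
            """Share of the page's shingles that do not appear in the template."""
            page = shingles(page_text)
            if not page:
                return 0.0
            return 100.0 * len(page - shingles(template_text)) / len(page)

        template = "the same header navigation and footer text repeated on every page"
        article = ("the same header navigation and footer text repeated on every page "
                   "followed by a long original review written just for this article")
        print(f"{percent_unique(article, template):.0f}% unique")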

  • I think it's a matter of preference. Where SEO comes into play is www results getting indexed as different from non-www results. As a safety measure, I might add this to your .htaccess if you are on Linux:

        # redirect any non-www request to the www hostname, permanently
        RewriteEngine on
        RewriteCond %{HTTP_HOST} ^mywebsite\.com$ [NC]
        RewriteRule (.*) http://www.mywebsite.com/$1 [R=301,L]

    This will force all non-www traffic to www.

    | SpottedFrogDesign
    0
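
    A quick way to verify the rule once deployed, assuming the placeholder hostname from the snippet above: the bare domain should answer with a 301 pointing at the www form:

        import requests

        # mywebsite.com is the placeholder from the .htaccess snippet above
        r = requests.get("http://mywebsite.com/", allow_redirects=False)
        print(r.status_code, r.headers.get("Location"))
        # expected: 301 http://www.mywebsite.com/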

  • Thanks Stephen. I have done quite a bit of searching, and it's the site map portion (creating relationships between pages and keywords) that I can't seem to find in a tool meant to tame a keyword universe. I've been thinking that a custom tool might be needed, but I don't have much knowledge in the software engineering space. Funny, because I work for a multinational, billion-dollar software solutions enterprise...

    | PTC4SEO
    0

  • Personally, I would make it the first post in the thread. When searching and arriving on forums, people are generally looking for a specific bit of information that is answered in that thread. Often the first post is a question, so if you can get that question to appear in the meta description in the SERPs, the user is immediately going to think "hey, that person has asked the same question I did" and click through.

    | seanmccauley
    0