Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Are the keyword terms this "SEO Company" is ranking you under receiving any actual searches?

    | stevenheron
    0

  • Trust is a reasonable factor. Consider this: Google uses trust when determining how accurate a company's Place page is. They go out and check whether the phone, address, and other info are listed on other known, trusted sites. So yes, EGOL's got a good point as well.

    | AlanBleiweiss
    0

  • Hi, SEOmoz does have a marketplace with many great professionals: http://www.seomoz.org/marketplace. If you want to take a look yourself, run your site through the SEOmoz campaign tools; you will see errors, 404s, duplicate content, anything that could be hurting your site from within. Also use http://www.opensiteexplorer.org/ to check whether the links are still there, or whether you left out any pages that were getting links. I don't really trust GWT that much; the data is just too weird sometimes: backlinks that don't exist, strange 404s, etc. In my experience, every time I have some broken links, even if there are only 50-100 of them, GWT shows me something like 10,000 new errors; after they are fixed, the errors in GWT start to go down too. Sometimes huge design (code) changes can affect your site's rankings on Google for a while, based on the idea: "if the people who link to you saw your site now that it's different, would they still be linking to you?" I wouldn't mind taking a look.

    | andresgmontero
    0

  • It's been two hours since the tweet, and we already went from 5 to 33 pages indexed on Google. Is it possible that the tweet alone helped, or did we just tweet 2 hours before the scheduled crawl? Haha, it's interesting.

    | daniel.alvarez
    0

  • Thanks everyone for the insights and for taking the time to help. I'm using the WP All in one SEO plugin and will continue to leave the tag pages set to noindex.

    | JSOC
    0

  • Sorry - no specific article comes to mind, but you could print out our answers!

    | AlanBleiweiss
    0

  • I agree with Daniel. Page-to-page 301 redirects are the best way to ensure each page's ranking value is passed along and then maintained with the least amount of loss.

    | AlanBleiweiss
    0
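For anyone implementing this, a page-to-page 301 map is often just an explicit old-URL-to-new-URL lookup consulted before the server falls back to a 404. A minimal Python sketch, with hypothetical paths chosen purely for illustration:

```python
# Minimal sketch of page-to-page 301 redirects (hypothetical URL paths).
# Each old URL maps to exactly one new URL, so ranking value is passed
# page-to-page rather than being funneled to a single landing page.
REDIRECTS = {
    "/old-blue-widgets.html": "/widgets/blue/",
    "/old-red-widgets.html": "/widgets/red/",
}

def redirect_for(path):
    """Return the 301 target for a path, or None if no redirect applies."""
    return REDIRECTS.get(path)
```

In practice the same table can be expressed as web-server rewrite rules; the dictionary form just makes the one-to-one mapping explicit.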

  • I use canonical references on all my pages no matter what. Most professional sites I encounter do as well. You will notice they are used on SEOmoz. I would use a rewrite rule mainly to do something along the lines of directing all your non-www traffic to its www counterpart. For the type of issue you are working on, I would use canonical tags on every page.

    | RyanKent
    0
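As a sketch of that approach: normalize the host to its www counterpart (the job the rewrite rule does) and emit a canonical tag for every page. The hostnames and the query-stripping policy below are assumptions for illustration, not a universal rule:

```python
from urllib.parse import urlsplit, urlunsplit

PREFERRED_HOST = "www.example.com"  # hypothetical preferred hostname

def canonical_tag(url):
    """Emit a rel=canonical tag pointing at the www version of a URL,
    with the query string and fragment stripped (one common policy)."""
    parts = urlsplit(url)
    host = parts.netloc
    if host == PREFERRED_HOST[len("www."):]:  # bare domain -> www
        host = PREFERRED_HOST
    clean = urlunsplit((parts.scheme, host, parts.path, "", ""))
    return '<link rel="canonical" href="%s" />' % clean
```

With both in place, the rewrite rule handles visitors and crawlers hitting the wrong host, while the canonical tag covers any duplicate URLs the rewrite rule misses.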

  • Now it is my turn to agree with EGOL. I would add one idea: in general I prefer not to share content, but periodically sharing a selected article can offer value and gain backlinks to your site.

    | RyanKent
    0

  • Any other thoughts? I'm leaning towards the 5-sites idea, but perhaps I'm overestimating the importance of the domain. Thanks!

    | OzDave
    0

  • Looking to do something for a promotional products site, but I'm not sure if it's a correct fit.

    | DavidKonigsberg
    0

  • The surf-safe information I shared isn't really relevant in this particular case, as Mulith shared that he operates an adult site. I added it in case others view this topic at a later time.

    | RyanKent
    0

  • I agree with building out pages that will have both the shorter URL and more relevance to said page, tied into the HTML and XML sitemaps. Placement is all about relevance and reputation, so of course if each page has its own SCOPE, it SHOULD obtain great placement, and people will have found more of what they were looking for when they get there.

    | SEOSHARK
    0

  • Bottom line, you cannot make data available online without offering a means for a user to grab that data. You said you "don't wish to make it easy," so I will share some ideas. EGOL's suggestion is good and not that hard to implement. I am not sure if your site requires registration, but you can set it up so guests can view a maximum of, say, 20 member pages, or whatever amount you deem to be a reasonable number. There are more complicated methods by which you can establish a script that will block any IP or user who pulls too many pages too quickly. The real challenge is your sitemap. If all that is required is the company's name, your sitemap is all someone needs, and in that case there is simply nothing I can think of that you can do. If the sitemap isn't a challenge, another idea is to present the data in a form that is not easy to read: you can leave the description information in HTML but present the company name in Flash, for example. Bottom line, if you want to rank well, the site has to be easy to crawl, and if the crawl data offers enough information for others to steal, there is simply no reasonable method to prevent automated tools from grabbing it.

    | RyanKent
    0
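The "block any IP that pulls too many pages too quickly" idea can be sketched as a sliding-window counter per IP. The window and budget numbers here are placeholders, not recommendations:

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60   # assumed window; tune per site
MAX_REQUESTS = 30     # assumed per-IP page budget within that window

_hits = defaultdict(deque)  # ip -> timestamps of recent requests

def allow_request(ip, now=None):
    """Sliding-window limiter: return False once an IP exceeds its budget."""
    now = time.monotonic() if now is None else now
    q = _hits[ip]
    # Drop timestamps that have aged out of the window.
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    if len(q) >= MAX_REQUESTS:
        return False
    q.append(now)
    return True
```

A real deployment would also need to whitelist legitimate crawlers (e.g. by verifying Googlebot's reverse DNS), since the whole point is to stay easy to crawl for search engines while slowing down scrapers.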

  • Often I see .de domains which rank in Switzerland and Austria as well, so I think there is no need to set up a .de domain. Just try to get German links to www.example.info/de/....

    | GregorHendrych
    0

  • Thank you for your quality comments. I think we are going to go with Evarett's suggestion, since I do feel like hundreds of thousands of 301s might be a bad idea. We are also discussing the possibility of keeping the old pages up (index, follow) with the old ads, just removing the seller information. We believe that users might well be interested in information about past items that have sold or expired. This wouldn't require any 301s, just dynamically adding the message suggested by Chris and a search engine/index to help find interesting sold/expired items. This way we could generate hundreds of thousands of content pages which might, over time, bring plenty of quality traffic to the website. These pages would have dynamically generated fresh content (new suggestions about relevant ads), so I don't feel too worried about old content in Google's index. We could also consider scraping some details of the sold items from the manufacturer's website. I've never dealt with this many pages combined with such thin content before, so the wrath of the Panda worries me a bit. However, the root domain is quite strong and these ads seem to draw quite a few links. The question that requires some further thought is whether or not having this many old, thin pages in Google's index will prove problematic. Thoughts? P.S. Since I didn't mention this before, the website is about used cars.

    | PanuKuuluvainen
    0