Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I have seen and achieved this with many keywords. Normally, the less competitive a keyword is, the less strict Google is about showing multiple listings from the same domain.

    | kchandler
    0

  • Dr. Pete, I just ran across one of your webinars yesterday and you brought up some great ideas. Earned a few points in my book. Too often, SEOs see changes in the rankings and react to counteract the change. Most of the time these bounces are actually a GOOD sign: it means Google saw your changes and is adjusting to them. If your changes were positive, you should see positive results. I have rarely found a case where a user made a positive change and got a negative result from Google. Patience is a virtue.

    | inhouseninja
    0

  • Oops, I forgot to update this question after I came to a conclusion. Thanks for answering! The redirects I ended up with look like this:

    RewriteEngine on
    RewriteRule ^old-product.html$ http://www.new-domain.com/new-product.html [R=301]
    RewriteRule ^old-category$ http://www.new-domain.se/new-category [R=301]

    I first tried writing them the way you suggested, but it didn't work the way I wanted for category pages (or folders). I also generated an .xml sitemap for the old site to use as a list of all the URLs to redirect, so I think I covered them all.

    | mrlolalot
    0
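
A cleaned-up sketch of rules like the ones above, for anyone adapting them. The domains and paths are placeholders from the post, and the escaped dot and the L flag are my additions, not part of the original rules:

```apache
# Hypothetical .htaccess on the old domain; domains and paths are placeholders.
RewriteEngine On

# Redirect a single page to its new home (R=301 = permanent; L = stop processing rules)
RewriteRule ^old-product\.html$ http://www.new-domain.com/new-product.html [R=301,L]

# Redirect a whole category/folder, carrying along anything after the folder name
RewriteRule ^old-category(/.*)?$ http://www.new-domain.com/new-category$1 [R=301,L]
```

Escaping the dot keeps `.` from matching any character, and the `L` flag prevents later rules from rewriting the already-redirected URL.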

  • If you search for "royal kona resort" it comes up; that is because it is optimized for the term. For a start, look at the titles and the domain names. It is obvious that one is going to rank higher than the other.

    | AlanMosley
    0

  • Humble and modest EGOL, but we all know you should be on that list too!

    | SteveOllington
    1

  • Robots.txt files are sequential, which means crawlers follow directives in the order they appear, so if two directives conflict, the last one wins. The simple way to do this is to disallow all files first, then allow the directory you want. It would look something like this:

    User-agent: *
    Disallow: /

    User-agent: *
    Allow: /test

    Caveat: this is NOT the way robots.txt is supposed to work. By design, robots.txt is meant for disallowing, and technically you shouldn't ever have to use it for allowing. That said, this should work pretty well. You can check your work in Google Webmaster Tools, which has a robots.txt checker under Site Configuration > Crawler Access. Just type in your proposed robots.txt and a test URL, and you should be good to go. Hope this helps!

    | Cyrus-Shepard
    0
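
For what it's worth, Google's own robots.txt implementation also accepts Allow and Disallow in a single group and resolves conflicts by the most specific (longest) matching rule rather than by order, so the same intent can be written more compactly. A sketch, reusing the answer's placeholder path:

```text
# Hypothetical robots.txt: block everything except the /test directory
User-agent: *
Disallow: /
Allow: /test
```

Under longest-match precedence, a URL like /test/page.html matches `Allow: /test` (more specific) rather than `Disallow: /`, so it stays crawlable while everything else is blocked.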

  • Here is the link to the social development guidelines from Facebook: http://www.facebook.com/FacebookPages?sk=app_283788381634959

    | fassa
    0

  • The extra 70 characters can help - Google DOES index them: http://www.seomoz.org/blog/title-tags-is-70-characters-the-best-practice-whiteboard-friday If you can, make sure the first 70 characters are the most relevant to the page content and readable for the user - if they don't like what they see, or it doesn't make sense without the other 70 characters, they're unlikely to click through. You don't mention whether your brand name is included in the tag; unless you're really well known in your niche, it usually makes sense to have the brand at the end of the title tag (if at all). I'm an SEO for a couple of news websites where the titles are often well over 70 characters (with the brand names at the end). It doesn't seem to cause any serious problems, and we have had high traffic from SERPs for longer-titled articles. I advise the editors to put the main keywords near the beginning of the title if possible, with readability (for the human!) being the most important thing.

    | Alex-Harford
    0
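
As a sketch of the structure described above (the keyword phrase and brand name are made up), main keywords first and the brand at the end:

```html
<!-- Hypothetical title tag: keywords near the front, brand at the end -->
<title>Main Keyword Phrase: Readable Secondary Detail | Example News Brand</title>
```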

  • Hi, for the technical part, the best approach would be to read the information on the hCard microformat itself, which can be found here. To speed up the process of implementing it on your client's Web site, you might also want to check out the hCard creator tool, which outputs the microformat(s) for you in the right way. Good luck!

    | Martijn_Scheijbeler
    0
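
For a concrete picture of what such a tool outputs, here is a minimal hCard sketch using the standard class names; the name, company, and contact details are placeholders:

```html
<!-- Hypothetical hCard; all names and details are placeholders -->
<div class="vcard">
  <span class="fn">Jane Doe</span>,
  <span class="org">Example Company</span>
  <div class="adr">
    <span class="street-address">123 Main Street</span>,
    <span class="locality">Springfield</span>
  </div>
  <span class="tel">+1-555-0100</span>
</div>
```

The `vcard` class marks the container, and parsers pick up the `fn` (formatted name), `org`, `adr`, and `tel` properties inside it.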

  • Thanks for the clarification Keri. Always helps.

    | RobertFisher
    0

  • Disregard, I figured it out.

    | greenleafy
    0

  • So you can use the MozBar to set up local searches when you're not local. I'd suggest building out content in subfolders on your site instead of using microsites to link to your larger domain: www.example.com/asheville. Why would you want to put content people are going to want to link to on your microsites, and then have only one link pointing from them to your main domain? Additionally, Google is smart enough to realize that you own all the domains, and you could get penalized. And, as someone who spent years in e-commerce, having more than one domain to maintain (when it sounds like you have one domain you're taking care of) is a pain in the butt.

    | EricaMcGillivray
    0

  • Yeah, essentially what I'm doing is moving www.blog.com to www.othersite.com/blog in order to give othersite.com the blog's Google juice.

    | brianmcc
    0

  • This is why I love this forum. We recently started seeing these URLs in our GWT report. We have hundreds of truncated URLs ending in "..." that go nowhere, and we can't figure out where they are coming from. We thought it could be Google's relatively new privacy policy of not passing along the data, but we're not sure. Anyone have any thoughts on that? Thanks!

    | Improvements
    0

  • We've taken the approach of adding rel="canonical" to all paginated content, with the link pointing to the first page of results. We keep everything indexed and followed. We also help Google identify the URL parameter created for paginated content in our Webmaster Tools settings. This has worked well for us, but another approach would be to add rel="next" and rel="prev" to the paginated links. This is openly supported by Google for handling pagination, but might be a little tricky to set up in your CMS. Good luck! Andrew

    | dunklea
    0
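
A sketch of both approaches mentioned above, as they would appear in the `<head>` of a paginated listing; the domain, path, and page parameter are placeholders:

```html
<!-- Hypothetical page 2 of a paginated category at /widgets?page=2 -->

<!-- Approach 1 (what the answer describes): canonical pointing to the first page -->
<link rel="canonical" href="http://www.example.com/widgets" />

<!-- Approach 2: rel="prev"/"next" chain linking the series together -->
<link rel="prev" href="http://www.example.com/widgets?page=1" />
<link rel="next" href="http://www.example.com/widgets?page=3" />
```

With the prev/next approach, the first page omits `rel="prev"` and the last page omits `rel="next"`, so the chain has clean endpoints.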

  • I snooped around in the Google Webmaster Tools help section, and it seems like a lot of other people have faced the same problem with no solution offered. Shame on Google! As much as possible, I would go back to all the sites that were linking to your old domain and ask them to update their links. 301s pass most of the link juice, but not all of it, so it's worthwhile to save as much of that as possible. It also helps Google start to ignore your old site and focus more on your new site. This is all probably a lot of work, but I hope it works out! Good luck. Andrew

    | dunklea
    0