Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • You could. But make sure the pages in there have some sort of "in-context" relevance to your pages as well. This would be better than a blind redirect. For example: abc.com/apples should be redirected to yoursite.com/fruits. No rule of thumb here, just trying to make it effective for the old site. (See the sketch of such a redirect map below.)

    | ManiKarthik
    0
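
    A minimal sketch of that kind of hand-curated, contextual redirect map, assuming a Python/Flask front end; the paths and domains are the hypothetical examples from the reply above:

    ```python
    # Contextual 301 redirects: map each old-site path to the most relevant
    # new page instead of blindly pointing everything at the homepage.
    from flask import Flask, abort, redirect

    app = Flask(__name__)

    # Hand-curated map (hypothetical paths from the example above).
    REDIRECT_MAP = {
        "/apples": "https://yoursite.com/fruits",
        "/oranges": "https://yoursite.com/fruits",
        "/garden-tools": "https://yoursite.com/equipment",
    }

    @app.route("/<path:old_path>")
    def contextual_redirect(old_path):
        target = REDIRECT_MAP.get("/" + old_path)
        if target:
            return redirect(target, code=301)  # 301 marks the move as permanent
        abort(404)
    ```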

  • The statistics (increased the average visitor's time by forty seconds and average pageviews by 20%) are for visitors entering the homepage from a Google search. These videos were both on multiple pages, and the visitor had the option of viewing the video on the homepage or clicking through to a deeper page that holds the video plus an article. Both of these videos were on the homepage: one at the top of the page, one at the very bottom, well below the fold. The video at top was 45 seconds; the video at bottom was 3:00. About 20-25% of visitors watched the top video and about 15-20% watched the lower video (data from CrazyEgg). This is a homepage with a lot of diverse information, so 20% of visitors viewing is very good in my opinion. More people clicked the top video than anything else on the page. I think that length is not as important as audience retention: the 45-second video retains 90% of visitors for the entire length of the video, and the 3:00 video retains 70% for the entire length of the video.

    | EGOL
    0

  • Do you know if Google is counting the reviews based on their IP address? Most of my competitors have fake reviews, and I suspect they use their own computers to do that.

    | echo1
    0

  • I apologize for the bluntness, but that is categorically false. Google absolutely punishes sites for 'bad neighborhood' affiliations through backlinks. You can find many instances of this occurring on the SEO boards. We actually report spam sites that link to us through GWT about once per month, due to a prior penalty that occurred this way. And no, it is not 'fair', but whoever said Google cared?

    | TechMama
    0

  • Hi Steffen, You shouldn't see a decrease in your SERPs for the page the images are on, assuming you keep the alt and title attributes the same. Google may even see the new URL as an update to the page, which has its benefits. If the images are ranked in Google image search, you will obviously have a drop in that image ranking. You can fight this by 301 redirecting the old image to the new image, as well as updating your sitemap. (A quick way to verify the redirects is sketched below.) Here is a great link on the subject, including a response from Rand explaining it more: http://www.seomoz.org/qa/view/52713/301-redirect-an-image

    | CaseyKluver
    0
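
    A quick way to spot-check that each old image URL returns a 301 to its new location, assuming Python with the requests library; the URLs are hypothetical placeholders:

    ```python
    # Verify that moved images 301 to their new URLs.
    import requests

    IMAGE_MOVES = {
        "https://example.com/img/old-photo.jpg": "https://example.com/media/new-photo.jpg",
    }

    for old, new in IMAGE_MOVES.items():
        resp = requests.head(old, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location")
        ok = resp.status_code == 301 and location == new
        print(f"{old} -> {resp.status_code} {location} ({'OK' if ok else 'CHECK'})")
    ```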

  • Unfortunately, I don't think there are many reliable options, in the sense that Google will always honor them. I don't think they gauge crawl frequency by the "expires" field - or, at least, it carries very little weight. As John and Rob mentioned, you can set the "changefreq" in the XML sitemap, but again, that's just a hint to Google; they seem to ignore it frequently. If it's really critical, a 304 is probably a stronger signal (a sketch of a conditional-GET handler follows this reply), but I suspect even that's hit or miss. I've never seen a site implement it on a large scale (100s or 1000s of pages), so I can't speak to that. Three broader questions/comments:

    (1) If you currently list all of these pages in your XML sitemap, consider taking them out. The XML sitemap doesn't have to contain every page on your site, and in many cases, I think it shouldn't. If you list these pages, you're basically telling Google to re-crawl them (regardless of the changefreq setting).

    (2) You may have overly complex crawl paths. In other words, it may not be the quantity of pages that's at issue, but how Google accesses those pages. They could be getting stuck in a loop, etc. It's going to take some research on a large site, but it'd be worth running a desktop crawler like Xenu or Screaming Frog. This could represent a site architecture problem (from an SEO standpoint).

    (3) Should all of these pages even be indexed at all, especially as time passes? More and more (especially post-Panda), having more indexed pages is often worse. If Googlebot is really hitting you that hard, it might be time to canonicalize some older content or 301-redirect it to newer, more relevant content. If it's not active at all, you could even NOINDEX or 404 it.

    | Dr-Pete
    0
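
    A minimal sketch of the 304 idea mentioned above: honor the crawler's If-Modified-Since header and return 304 Not Modified when nothing has changed. This assumes a Python/Flask app; the route, date, and page body are hypothetical placeholders:

    ```python
    # Conditional GET: answer 304 when the page hasn't changed since the
    # crawler's last visit, so the bot can skip re-downloading it.
    from datetime import datetime, timezone
    from email.utils import format_datetime, parsedate_to_datetime

    from flask import Flask, Response, request

    app = Flask(__name__)

    # Hypothetical lookup; in practice this would come from your CMS/database.
    def last_modified_for(page):
        return datetime(2012, 1, 15, tzinfo=timezone.utc)

    @app.route("/archive/<path:page>")
    def archive_page(page):
        last_mod = last_modified_for(page)
        ims = request.headers.get("If-Modified-Since")
        if ims and parsedate_to_datetime(ims) >= last_mod:
            return Response(status=304)  # nothing changed; no body needed
        body = f"<html><body>Archived page: {page}</body></html>"  # placeholder content
        resp = Response(body)
        resp.headers["Last-Modified"] = format_datetime(last_mod, usegmt=True)
        return resp
    ```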

  • I tried job boards for other jobs (other than writing) and didn't have much luck. I should try again! Thanks for the advice.

    | inhouseseo
    0

  • I have virtually no direct experience with Joomla and Drupal, but your feedback pretty much confirms my inclination to move in that direction to meet the goals of some upcoming projects. All-in-One SEO seems to be popular on WP. Any thoughts on how it compares to SEO by Yoast? Thanks Robert (and Rob). This is helpful stuff.

    | ksracer
    0

  • Can I get an invite to Press Release Point?

    | axzm
    0

  • Hi, I personally recommend Google Custom Search. It works great for me. You can also find many search scripts in Google for the most popular platforms (PHP, ASP.NET, etc.). These will help if you don't want to add your pages to Google's index, or if you have private sections on your site (which can be personalized, by the way).

    | de4e
    0

  • I think that if you have a lot of these links, you are painting a target on your back.

    | EGOL
    0

  • Given that graph, Google is certainly visiting your site; it's just not always accurate at judging when something was published. We've published posts on SEOmoz, and five minutes later I'll look in the SERPs and Google will say "nine hours ago".

    | KeriMorgret
    0

  • http://youtu.be/I5qZ9GrBf4w - News sites frequently place their branding atop the video.

    | EGOL
    0

  • Greetings, Bede, So sorry about the line breaks. We've been having a little problem with that, and I hope my response to you isn't one big paragraph. The process you have been using for your Local clients remains a best practice, in my experience. Not only does this separate the information out as best you can for bots, but it is also good practice for human users, who will be signaled upon reaching such a landing page that they are looking at information about their own geographic location. You are so right that things keep changing in Local, but this is one area in which creating unique, rich content landing pages is still a smart choice.

    In regard to Schema vs. hCard, for instance, the benefit is that Google, Yahoo & Bing have all agreed on this markup as a standard. However, definitely read this 2011 discussion at Mike Blumenthal's blog, in which Mike opines that rushing to switch from hCard to Schema may not be your #1 top priority: http://blumenthals.com/blog/2011/10/07/a-free-tool-to-build-geo-sitemap-and-schema-org-compliant-files/ Through 2011, I continued to use hCard on my Local clients' sites without issue, but it is certainly smart to acquaint yourself with the features of Schema that apply to Local. The tool linked to from that post is a great place to begin. (A small illustration of the LocalBusiness vocabulary is sketched below.) Hope my feedback is useful! Miriam

    | MiriamEllis
    0
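
    For what it's worth, here is a small illustration of the schema.org LocalBusiness vocabulary mentioned above, expressed as JSON-LD built in Python; the business details are hypothetical placeholders, and hCard/microdata remain equally valid ways to mark up the same fields:

    ```python
    # Build schema.org LocalBusiness data for a local landing page.
    import json

    local_business = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": "Example Plumbing Co.",  # hypothetical business
        "telephone": "+1-555-555-0100",
        "address": {
            "@type": "PostalAddress",
            "streetAddress": "123 Main St",
            "addressLocality": "Springfield",
            "addressRegion": "IL",
            "postalCode": "62701",
        },
    }

    # Embed the output in the landing page inside a
    # <script type="application/ld+json"> element.
    print(json.dumps(local_business, indent=2))
    ```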

  • We have had the robots.txt changes & meta tags in place since the launch of these pages. We have internal links to these pages via <a> tags. So, should we remove the robots.txt disallows? Also, does adding nofollow on the links help? (See the robots.txt check sketched below.)

    | gaganc
    0
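
    A quick sketch for checking which of those pages the robots.txt disallows actually block, using Python's standard-library robotparser; the domain and paths are hypothetical placeholders:

    ```python
    # Check whether specific URLs are blocked for Googlebot by robots.txt.
    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()  # fetches and parses the live robots.txt

    for path in ("/private/page-1", "/private/page-2"):
        allowed = rp.can_fetch("Googlebot", "https://example.com" + path)
        print(f"{path}: {'crawlable' if allowed else 'blocked'}")
    ```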