Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.

  • Hi, thanks for the reply, but that didn't answer my question. My question is below: the same doubt applies to these two URL structures as well: http://www.domain.com (root domain without a trailing slash) and http://www.domain.com/ (domain with a trailing slash). If I am concentrating on my homepage, which is the better URL structure to follow - the root domain or the domain with the trailing slash? Thanks, Vicky

    | vickygoal
    0
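For the root URL in particular, the two forms are equivalent at the HTTP level: a request for http://www.domain.com is sent on the wire as `GET /`, so the server sees the same resource either way, and the trailing slash only matters on deeper paths. If you still want to state a preferred form explicitly, a rel="canonical" tag is one option - a sketch, keeping the question's placeholder domain:

```html
<!-- In the <head> of the homepage; www.domain.com is the question's placeholder -->
<link rel="canonical" href="http://www.domain.com/" />
```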

  • You can use pagination markup that lets Google know it's one long list. There's a whole video from Google about it here: http://googlewebmastercentral.blogspot.co.uk/2012/03/video-about-pagination-with-relnext-and.html - it's worth watching. I don't know which other search engines support this markup.

    | BenFox
    1
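The markup the video covers is a pair of link tags in the head of each page in the series - a sketch with placeholder URLs and query parameters:

```html
<!-- On page 2 of a paginated list; example.com and the page parameter are placeholders -->
<link rel="prev" href="http://www.example.com/list?page=1" />
<link rel="next" href="http://www.example.com/list?page=3" />
```

The first page in the series omits rel="prev" and the last page omits rel="next".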

  • Thanks Alsvik, yes, I have just done this. We had originally set this up with Google, but somehow the site owner verification for the URL got removed and Google had reverted to "No preference". Hopefully this will sort the problem.

    | FFTCOUK
    0

  • Hey Fergus, thanks for writing in, and sorry for the confusion! The Link Analysis data is actually supported by our Mozscape Index and is not related to the campaign crawl. You aren't seeing any metrics because your site hasn't been indexed in the Mozscape Index yet. Most new sites and links will be indexed by our spiders and available in Mozscape and Open Site Explorer within 60 days, but some take even longer for many reasons, including the crawlability of sites, the number of inbound links to them, and the depth of pages in subdirectories.

    We update our Mozscape Index about every 4-6 weeks. Crawling the entire Internet to look for links takes 2-3 weeks, but our crawlers are always in motion. When we need to start processing, we grab all the data they have collected, and processing can take up to 3 more weeks as we determine which of those links are the most important. You can see our most recently updated schedule here: http://seomoz.zendesk.com/entries/345964-linkscape-update-schedule

    Mozscape focuses on a breadth-first approach, so we almost always have content from the homepage of websites, externally linked-to pages, and pages higher up in a site's information hierarchy. However, deep pages that are buried beneath many layers of navigation are sometimes missed, and it may be several index updates before we catch all of these. If our crawlers or data sources are blocked from reaching those URLs, they may not be included in our index (though links that point to those pages will still be available). Finally, the URLs seen by Mozscape must be linked to by other documents on the web or our index will not include them.

    For now, the best thing you can do to help your domain become indexed is to work on link building for links from sites with high mozRank. If you need help with that, you may want to ask the PRO Q&A community about it at www.seomoz.org/q. I hope this information helps! While the site and links may not be indexed yet, give it some time - maybe we'll see it in the index next month.

    | ChiarynMiranda
    0

  • From personal experience and from advice I've had before, you can expect Google to be lenient. From what I understand, they will discredit the links but make sure it won't harm your site in any future updates; just make sure you're not partaking in any suspicious activity elsewhere. I personally had this happen on a few separate occasions. The first time round I didn't react and let it be; then Google did an update and gave me a warning about unnatural links. I got it sorted by demonstrating to Google that the links weren't our doing and that we had tried to contact the site owner (found their details through Whois). Google then replied saying they had reconsidered my request, and rankings went back up. The second time it happened, I went straight to Google with a reconsideration request and explained the situation; they again came back and said they had recorded my input, with no impact to the site. So I definitely recommend going straight to them to let them know the situation. Sean

    | SeanLade
    0

  • Stephanie Chang posted on just this subject last month on the blog at http://www.seomoz.org/blog/how-should-you-handle-expired-content. Her post may help you with some ideas -- it was actually because of all of these types of questions in Q&A that she wrote the post.

    | KeriMorgret
    0

  • Solid advice.  AAAhhh.... reminding me I need to work on getting my content shared more easily too!

    | DeltonChilds
    1

  • It's embedded CSS, correct. Thanks!

    | AdiRste
    0

  • There is no real point in removing links. Links that have been devalued simply don't pass any benefit on to your site. The only real reason you would attempt to remove links is if you have been penalized by Google, but even then I would be wary of removing them, as the fact that you are able to remove the links can itself make them look unnatural to Google.

    | MalcolmGibb
    0

  • There are many crawl tools out there, but Xenu is the one I would recommend for this. It will crawl your site and then, at the end, ask for your FTP details so it can specifically check for orphaned pages. That should produce the report you need. *edit - here's the link: http://home.snafu.de/tilman/xenulink.html

    | My-Favourite-Holiday-Cottages
    0

  • I used Joomla long ago, so I'm not sure how much help I'll be... but I'm 99% sure that Joomla does not do this inherently, and if it did, it would actually be a feed and not a duplicate page. My guess is that you either have a plugin that is putting those pages there or there is something funky going on with a mod_rewrite rule. I'd also note that your sitemap shows other duplicate page issues (yes, they even show up in your sitemap): pages with trailing slashes and without trailing slashes. There may be a way to fix those with a Joomla plugin, or you can check out http://httpd.apache.org/docs/2.0/misc/rewriteguide.html for tips on how to overcome the trailing slash problem.

    | jgower
    0
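For the trailing slash problem mentioned above, a mod_rewrite rule can 301 one form to the other so that only one version of each URL is reachable - a sketch for an .htaccess file, assuming the slashed version is the one you want to keep:

```apache
# Sketch: 301 any URL that is not an existing file to its trailing-slash form
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]
```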

  • In most cases these duplicate pages do exist. This is something I had to learn - I was always thinking "No, it can't be that bad..." and then "Oh hell...". Google sorts duplicate content out on its own, so the "indexed pages" count is not a good indicator for spotting internal duplicate content issues. On one of our websites we gained over 80% in search engine traffic just by sorting the duplicates out and 301 redirecting some externally linked duplicate pages. So it's worth the effort.

    | softclick
    0
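The 301 redirects mentioned above can be done with a single directive in the site's Apache config or .htaccess - a sketch with placeholder paths:

```apache
# Sketch: send an externally linked duplicate URL to its canonical counterpart
Redirect 301 /old-duplicate-page http://www.example.com/canonical-page
```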

  • Run the URLs through Scrapebox to see if they are indexed by Google. If so, then maybe you just need to wait for Google to index those links; they don't all get indexed on the first crawl, probably due to the need to analyse the links and find any evidence of link pyramids or wheels, etc.

    | flashbangwhizz
    0