Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi Eric, That's a great question! To be safe I'd suggest adding an HTML sitemap page (for both the mobile and desktop versions) which includes links to all the pages on the site. That way you can be sure that Googlebot Mobile is able to crawl the full site. Hope that helps!

    | bridget.randolph
    0
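A page like the one described above can be generated trivially; here is a minimal sketch in Python (the URL list is a placeholder — you would feed in your site's real pages):

```python
# Generate a minimal HTML sitemap page from a list of (path, title) pairs.
# The list below is hypothetical; substitute your site's actual pages.
urls = [
    ("/", "Home"),
    ("/about/", "About"),
    ("/blog/", "Blog"),
]

items = "\n".join(
    f'    <li><a href="{href}">{title}</a></li>' for href, title in urls
)
page = f"""<!DOCTYPE html>
<html>
<head><title>Sitemap</title></head>
<body>
  <h1>Sitemap</h1>
  <ul>
{items}
  </ul>
</body>
</html>"""
print(page)
```

The same page works for both mobile and desktop visitors, which is the point of the suggestion above.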

  • Does your text make clear the meaning of the abbreviation? When we were SEOmoz.org, we never needed to add /search-engine-optimization/ to our URLs, for example.

    | KeriMorgret
    0

  • If you are using a great CMS, like DNN, you have the ability to paste as plain text and then style the HTML accordingly. If you don't, prepare to create a lot of junk code and span tags.

    | WebMarkets
    0

  • It may be down to an algorithm change, I'm going to review my Bing Webmaster account tomorrow to look for any clues. It may be worth typing site:yourdomain to see what pages have been indexed. I'm also going to regenerate the sitemap and resubmit it tomorrow. Let me know if you find any solutions.

    | chrissmithps
    0

  • I have never done a migration of a website with so many URLs; it is kind of overwhelming. I do have a 404 page with the catch-all rule in place. See, I'm coming from the business side of SEO; I'm not really the developer getting in there and actually doing the migration (I tip my hat to all you developers - without you I am nothing but a voice). I'm guiding my developer to the safest route. Doing the migration all at once does make sense to prevent both sites being cached.

    | rpaiva
    0

  • A manual action is normally for links coming in, not going out, so you should be safe if you've cleaned up your link profile etc.

    | GPainter
    0

  • trung.ngo - check out this article I posted http://www.blindfiveyearold.com/crawl-optimization that's where I got my "inspiration" from to consider using robots.txt instead...

    | khi5
    0

  • How long have the items been showing an error? Webmaster Tools seems to be a little slow sometimes. I made a small mistake in my markup that caused me to have errors. It has now been fixed, but the error count has just slowly been declining for 3 weeks and still has quite a ways to go.

    | EcommerceSite
    0

  • Things like this can be a little tricky to dissect anyway. The most important thing is that you're doing things right; it's just a matter of accounting for Google's ability to crawl.

    | ecommercebc
    0

  • I just wanted to point out a few issues I saw right away:

    Internal linking might be playing a role. The site is not linking to these hacks pages right from the homepage; they are down a second level via the "download hacks" link. Yet the forum is linked universally across the main navigation. In fact, the forum comes out on top for most internal links --> http://screencast.com/t/MYjf3dNzZGE Bear in mind Google will treat privacy pages and sitemap internal links differently, because they know these are standard pages linked to universally, so you don't need to go removing links to those - but you should think about overall site architecture in relation to your content pages.

    The top YouTube video won't play - it says: "This Video has Been Removed as a Violation of YouTube's policy against spam, scams, and commercially deceptive content". That can't be good for the page's quality score.

    All the comments below it are unanswered. This is your opportunity to engage with visitors and show some extra, helpful content - I would respond to some of the comments.

    I see 225,000 pages indexed on the entire root domain. Are there really that many pages with actual content that people are looking at and using on a regular basis? Maybe it's a good idea to do a content audit on the site and remove or consolidate low-quality and/or extra pages.

    I am not familiar with this industry and type of site - but Google seems to be trying to rank forum pages for other domains as well, and only a few "landing pages" - and these landing pages just try to get the visitor to register; only after they register can they get access to the content. Google is reluctant to rank pages on which the user cannot easily complete the desired action. Maybe this industry is out of that norm?

    Make sure you are not blocking CSS and JS from being crawled in robots.txt - http://www.ilikecheats.com/robots.txt - there's a lot of stuff in your file there, and Google now prefers to crawl these files, so I would check you're not blocking that. Use the new Fetch and Render tool in Webmaster Tools.

    You've got an infinite crawl loop happening --> http://screencast.com/t/hNekO6fx3A - I found this with Screaming Frog SEO Spider. Something you'll definitely want to resolve.

    Lots of pages link internally via 301 redirects, which can also hurt crawl efficiency.

    Did you intend to make this page a "post" and not a "page"? http://www.ilikecheats.com/01/rust-cheats-hacks-aimbot/ - see my video about the differences - either way you should be updating this page and fixing issues as they arise to make sure users are happily finding what they need on it.

    It doesn't seem like there is any one silver-bullet fix, but it seems like Google might be having trouble figuring out which page to rank for certain queries - so perhaps the architecture, keyword targeting etc. are not clear enough. I also suspect the forum may be more trusted and authoritative due to user metrics. It's likely users are visiting the forum more often, staying longer, and engaging due to the nature of the forum. But you can certainly help Google out by clarifying the site structure and page targeting a bit better.

    | evolvingSEO
    0
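The robots.txt check suggested above can be automated with Python's standard library. A minimal sketch — the Disallow rules here are hypothetical, not the actual contents of the ilikecheats.com file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules; the real file may differ. The point is to
# test whether CSS/JS asset paths end up blocked for Googlebot.
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Can Googlebot fetch a stylesheet under a disallowed path? A normal page?
css_allowed = parser.can_fetch("Googlebot", "/wp-includes/css/style.css")
page_allowed = parser.can_fetch("Googlebot", "/index.html")
print(css_allowed, page_allowed)  # False True
```

Run this against each CSS/JS path your templates load; any `False` is a candidate for the rendering problems described above.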

  • Hi, I noticed a dip in traffic after my last relaunch as well, but it got back to normal within 2 weeks. I can recommend doing a search in Google with site:yoursite.tld to check which pages are listed. Make an Excel sheet from that and check whether you have done all the 301s properly.

    | AutofokusM
    0
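Checking that list of URLs by hand gets tedious; the per-URL check can be sketched in Python. The tiny stand-in server below only exists to make the example self-contained — in practice you would point `redirect_status` at your live site and the old URLs from your spreadsheet:

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in server so the sketch runs anywhere: /old-page 301s to /new-page.
class Handler(BaseHTTPRequestHandler):
    def do_HEAD(self):
        if self.path == "/old-page":
            self.send_response(301)
            self.send_header("Location", "/new-page")
        else:
            self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # keep the output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None  # report the redirect instead of following it

opener = urllib.request.build_opener(NoRedirect)

def redirect_status(url):
    """Return (status code, Location header) without following redirects."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        with opener.open(request) as response:
            return response.status, None
    except urllib.error.HTTPError as err:
        return err.code, err.headers.get("Location")

status, location = redirect_status(base + "/old-page")
print(status, location)  # 301 /new-page
server.shutdown()
```

Anything that comes back as a 404, a 302, or a 301 pointing at the wrong target is a redirect you missed.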

  • Hi! The "Analyze Issues by Page" section actually just says that we didn't crawl that URL, not that it throws a 500 so there isn't an analysis we can provide. http://screencast.com/t/4dY8SZBs You can wait to see if that page is crawled during your next weekly update or run a crawl through the Crawl Test tool (in Research Tools). Crawl test will give you an analysis of up to 3k pages. There is a chance it still won't be crawled, but it is a way to get updated diagnostic data before the next scheduled weekly crawl. There was one 500 reported in the crawl and I can confirm it's still throwing that error, but it's a completely different page. Hope this helps clear things up!

    | SamWeber
    0

  • thx, Alan. Within real estate MLS - if I index all "MLS result pages" (ex: http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/) I will have about 5,000 such MLS result pages (I mean 5,000 such category pages with each category often having more than 1 page). I have added unique quality content on Page 1 of about 300 such MLS result pages and I have added rel=next prev. For the other 4,700 pages I currently have "noindex, follow". Question: is it OK to have such a large amount of pages with "noindex, follow" on or do I run the risk Google thinks "hmmm….though we do not index, seems like a lot of crap on this website….let us lower ranking even for the quality pages." Would I simply be better off letting everything index? I am concerned if I let those pages index that will dilute the value of my high quality pages. I am thinking if I completely delete those low relevancy pages from my website it would be ideal (in order for Google to see my site's value) but users looking to buy real estate would not see as many listings as on other websites and that could be a concern. Any insight appreciated. thx

    | khi5
    0

  • It would look unnatural, and you have a point about what may count in the future

    | AlanMosley
    0

  • 1. The bonus of a subdomain is it's still sort of linked to your site and can be easier to treat separately (organise), but the con is it doesn't really help boost your main domain. 2. Having it on your main domain, e.g. /docs, can help boost your main site (if people link to the PDFs etc.), but it can obviously increase site size (and sometimes load times) and can get a bit messy to organise. At the end of the day it comes down to your preference and what's easier for the user; there are pros and cons for both. Personally I tend to lean toward having it on your main domain, just to increase its authority and value to users (and thus Google). Good luck.

    | GPainter
    0

  • Oh my God Fred!! The weekend sub-domain has been completely blocked from being crawled by a robots.txt file sitting in the root of the sub-domain (http://weekend.visitsweden.com/robots.txt):

        User-agent: *
        Disallow: /

    Please remove the '/' from there **immediately**.

    | Devanur-Rafi
    0
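The difference that single slash makes can be demonstrated with Python's `urllib.robotparser` — `Disallow: /` blocks the whole subdomain, while an empty `Disallow:` allows everything:

```python
from urllib.robotparser import RobotFileParser

# The broken file: "Disallow: /" blocks every path for every crawler.
blocking = RobotFileParser()
blocking.parse("User-agent: *\nDisallow: /".splitlines())

# The fixed file: an empty Disallow value allows everything.
fixed = RobotFileParser()
fixed.parse("User-agent: *\nDisallow:".splitlines())

print(blocking.can_fetch("Googlebot", "/any/page"))  # False
print(fixed.can_fetch("Googlebot", "/any/page"))     # True
```

Running a check like this against your live robots.txt after any deploy is a cheap way to catch this class of accident early.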

  • What did you do specifically to mitigate the problem? You can PM me, if you would like.

    | Travis_Bailey
    0

  • That is the best approach, 301 the old to the new. The length of time that it will take Google is dependent on a number of factors, and if the site is new, it can take a while.  Take a look at the crawl frequency in Webmaster tools to get an idea of how often Google is visiting your site. As to when to get rid of the old page - Google might not be your only source of traffic, so keep an eye on the 301s in your web server logs, and you can delete the old one once it isn't hit any more. Or you could leave it.  Ideally you would be using a CMS so that you don't need to spend a lot of time managing all this.  Most CMSs will let you just rename a page and add a 301 link to it, so there is only ever a single page.

    | HumConsulting
    0
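The "keep an eye on the 301s in your web server logs" step above can be sketched as a one-liner over the access log. The log format and path here are assumptions (a common combined-log style); adapt the match to your server:

```python
# Count how often the old URL is still being requested, according to the
# access log. The sample lines and /old-page path are hypothetical.
log_lines = [
    '1.2.3.4 - - [10/Oct/2024] "GET /old-page HTTP/1.1" 301 -',
    '5.6.7.8 - - [10/Oct/2024] "GET /new-page HTTP/1.1" 200 -',
    '9.9.9.9 - - [11/Oct/2024] "GET /old-page HTTP/1.1" 301 -',
]

hits = sum(1 for line in log_lines if '"GET /old-page ' in line)
print(hits)  # 2 -- the old URL is still being requested; keep the 301 in place
```

Once that count stays at zero for a while, it is reasonably safe to retire the old page, as the answer above suggests.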

  • Thanks Laurean, I appreciate the response. It's actually EGOL who stated that you can get double listings with long tail keywords.

    | BobGW
    0

  • Hi Andrew. Sorry it's taken me ages to get back to this, but this is an awesome resource; it should help you out. http://moz.com/academy/redirects

    | Chstphrjohn
    0