Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • None of those changes have been made, and unfortunately I can't find out why the previous guy put the disallow on those files. I will give it a couple of weeks and see if anything changes; if not, I will remove it from the robots.txt! Thanks for your help! Oscar

    | PremioOscar
    0

  • Thank you for the excellent advice, everyone! I feel much better now. I was worried I was missing out on an opportunity, but I feel comfortable now staying where I am. THANK YOU!

    | eqgirl
    0

  • Having too many 302s won't necessarily harm rankings in and of itself. However, don't expect the PageRank from those old pages to be applied to the new page / home page. Also, redirects, whether 302 or 301, that point from a specific page to the home page are often treated as soft 404s anyway.
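    If you do decide to consolidate old pages, a minimal sketch of a permanent redirect in .htaccess (assuming a Linux/Apache host with mod_alias enabled; the paths and domain are placeholders, not from the original thread):

    ```apache
    # Permanent (301) redirect from a retired page to its closest replacement.
    # /old-page/ and the target URL are hypothetical -- substitute your own.
    Redirect 301 /old-page/ https://www.example.com/new-page/
    ```

    A 301 tells search engines the move is permanent, which is what lets link equity be consolidated; a 302 signals a temporary move and usually won't.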

    | Everett
    0

  • Weird. We were having a problem where lots of our skill pages were getting our <noscript> text used as page descriptions on Google SERPs. We added these comments, and Googlebot reverted to using our meta description as the page description in SERPs. It could have been a freak coincidence that Google stopped using our <noscript> text right after we implemented the tags, or Google was (possibly accidentally) supporting them for web search a while back when we originally did this and has now stopped. Anyway, our SERPs remain clean of our <noscript> text today (example: https://www.google.com/search?q=site:www.ixl.com/math/grade-5). John Mueller recently commented on that Quora thread saying it won't do anything for web search, so IMO that puts this to rest.
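    For anyone finding this later, the comments being discussed are the googleon/googleoff markers. A minimal sketch of how they wrap content (note: these are documented for the Google Search Appliance, and per John Mueller's comment above they do nothing for web search):

    ```html
    <!-- Hypothetical page fragment: the googleoff/googleon comments wrap
         content you don't want used in snippets. Documented for the Google
         Search Appliance only; not honored by Google web search. -->
    <!--googleoff: snippet-->
    <noscript>This site requires JavaScript to display skill practice.</noscript>
    <!--googleon: snippet-->
    ```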

    | john4math
    0

  • I have just been looking at this same problem, and from what I can figure out, the two methods you mention above, separate pages and Ajax pushState, are the best (maybe the only) ways to go.
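    For reference, a minimal sketch of the pushState approach (the URL scheme and the `loadSkillPage` helper are hypothetical, not from the thread; the point is that each Ajax view gets its own real, crawlable URL):

    ```javascript
    // Build a real URL for each Ajax-loaded view -- the URL scheme here
    // is hypothetical; adjust to your own site structure.
    function skillUrl(slug) {
      return "/skills/" + encodeURIComponent(slug);
    }

    function showSkill(slug) {
      var url = skillUrl(slug);
      // Guarded so this sketch also runs outside a browser environment.
      if (typeof history !== "undefined" && history.pushState) {
        history.pushState({ slug: slug }, "", url); // updates the address bar
      }
      // loadSkillPage(url); // hypothetical helper: fetch and swap in content
      return url;
    }
    ```

    Because the address bar reflects a distinct URL for each view, search engines (and users sharing links) see separate pages rather than one URL serving everything.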

    | agua
    1

  • I agree with EGOL. I think that a new design should improve user experience on the site and all related metrics. In this case we're speaking about the same list EGOL gives, from bounce rate to conversion rate and from time on site to internal navigation. This should be tested on an A/B platform, or whatever ensures you're not changing from one day to the next without knowing the effect of the new design. This requires good tools and time (Optimizely may be one). A new design may also improve SEO rankings, and in my opinion that means all the changes made in development of the site:

    SPEED: improving CSS and JavaScript, using sprites, and anything else that helps you speed up your site (Google loves that!).

    URL REDESIGN: if URLs are completely messed up, it could be good to review them and write more user-friendly URLs. BEWARE! This may cause your rankings to dance a bit, due to the effects of 301s and the changes themselves. 301s can be set up the wrong way, and even when set up correctly they may lose some value from the old page, so if you really NEED to update your URL structure, be sure of the benefits. If you're not sure, DON'T CHANGE THEM.

    EXPERIENCE: it's a really great opportunity to make the most of increasing mobile traffic and build a responsive UI, with a consistent experience across all devices.

    All of the above can add value, but consider design mainly as a way to improve user experience on the site rather than to please Googlebot. Try to improve user experience metrics without hurting SEO: keep the code as it is (improve it, but don't rewrite it), and don't change site content, meta data, or URLs unless you have a clear plan for how to take advantage of that change. Hope this helps!

    | mememax
    0

  • Thank you. That thought process did occur to me at the time of adding the robots.txt, but that was back in March and the impressions drop happened this week. I have updated the robots.txt to use the new disallow rule rather than "/search". I also just reinstalled my last back-up, which was from just before I added the Google comments. Thanks for your help!
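    For anyone comparing notes, a disallow rule scoped to one directory looks like this (the path is a placeholder; also keep in mind robots.txt blocks crawling, not indexing, so already-indexed URLs can linger):

    ```text
    # Hypothetical example: block all crawlers from the search results pages.
    User-agent: *
    Disallow: /search/
    ```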

    | Silkstream
    0

  • Hi Federico, Thanks a lot for your response. Much appreciated. I will start implementing this and see what happens. Regards.

    | B.Great
    0

  • OK, so I think my last paragraph might be the best option. Let me explain. Is the content also hosted on a master site that you own, or is it only available on the sites you distribute it to? I.e., is there a main source? If yes, I would rewrite the content, taking small snippets of text from the original article and writing a few paragraphs; this way you can reduce the amount of work you need to do. Then I would link to that main source, but using a nofollow link. So essentially you have all your sites with some unique content that references the master piece of content in some way and links to it for further reading if need be. If you noindex the content, those sites will never have a chance of ranking for those key phrases. If there is no original source of that content hosted on a master site somewhere, then rewrite the content or noindex it. All those sites hosting the exact same content is not good for any of them.
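    The nofollow link back to the master source is just a rel attribute on the anchor. A sketch (the URL and anchor text are placeholders):

    ```html
    <!-- Hypothetical link from a distributed site back to the master copy.
         rel="nofollow" asks search engines not to pass link equity through it. -->
    <a href="https://master-site.example/original-article" rel="nofollow">
      Read the full article
    </a>
    ```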

    | gazzerman1
    0

  • FYI, in this screenshot I can see the "About", "Additional Info", "Contact", and "Media" pages in the Google cached version of the site, but I do need to click on them to make the content appear. To Google and other search engines, these are not separate pages but content served within the same page; the URL doesn't change at all. If you wanted to have those pages indexed, I'd recommend creating them as separate pages, with links that open in a new page. That said, you might get penalized for duplicate content if you keep all of the same content on the main page and also list this information below. Another idea would be to keep the left-hand navigation for About, Additional Info, Contact, and Media, but have all of the content display on the page and just link to it from the top. The way you have it built does limit the page length, but the user experience may be confusing to some, especially on a touchscreen tablet. (Screenshot: google-cached-result-toronto-auto-vault.jpg)

    | customerparadigm.com
    0

  • We ran into this in the past, and one thing that we think happened is that links to the dev site were sent via email to several Gmail accounts. We think this is how Google then indexed the site, as there were no inbound links posted anywhere. I think that the main issue is how it's perceived by the client, and whether they are freaking out about it. In that case, using an access control password to prevent anyone from reaching the site will stop anyone from seeing it. The robots.txt file should flush it out, but yes, it takes a little bit of time.

    | customerparadigm.com
    0

  • Hi Easigrassne, Coming up in Google's local pack of results is dependent upon you having a physical location in the city you're hoping to rank for. If you want to rank in the local results for three cities, you must have a physical location in each. You can then build a landing page on your website for each of your three physical locations and link your three Google+ Local pages to the respective landing pages on your website. If you do not have physical locations in each of the three cities to which customers come to do business, then you must go after organic rankings via developing content surrounding these locations and your services there. If your business is an SAB (service area business) like a landscaping company, a blog is a great platform for publishing this type of content. For example, you could write a post about lawn installation in City A, perennial border landscaping in City B, and pond building in City C. You can continue to showcase your projects on an ongoing basis, devoting a new blog post to each completed project and developing a library of work that can begin ranking in Google's organic results.

    | MiriamEllis
    0

  • They really aren't providing any traffic and the content is poor, and not written by me. I generally answer these questions by thinking about the user experience.  I would delete anything from the site that does not provide a good user experience.  Google devotes considerable resources to determine which sites provide a good user experience and we can assume they will continue to get better at this in the future. Best, Christopher

    | ChristopherGlaeser
    0

  • If the web server does not support (or the admin does not want to enable) this feature, you could always have your front-end templates include a small string which holds the date/time when the page was last updated. Something along the lines of "Last updated on: ..." at the bottom or top of the content area. It's also a useful bit of information for users.
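    A sketch of what that template fragment might look like (the date and class name are placeholders; this is a visible fallback for when the server can't send a Last-Modified header):

    ```html
    <!-- Hypothetical template fragment: a visible freshness hint for users
         and crawlers when Last-Modified headers aren't available. -->
    <p class="last-updated">Last updated on: 2013-06-14</p>
    ```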

    | jmueller
    0

  • Hi Amanda, You could just change your single pattern-matched 301 to point all links from your old site to the home page of the new site. That way at least someone who clicks on a link to your old site will find the new site. Also, without a 301 for all the links on your old site, those old URLs are going to sit around in search engine indexes for a long time. Peter
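    That catch-all can be done with a single pattern match in .htaccess. A sketch, assuming an Apache server with mod_alias (the target domain is a placeholder):

    ```apache
    # Hypothetical catch-all: send every URL on the old site to the
    # new site's home page with a permanent (301) redirect.
    RedirectMatch 301 ^/(.*)$ https://www.new-site.example/
    ```

    Note this funnels everything to the home page; if the new site has equivalent pages, per-page redirects preserve more relevance.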

    | crackingmedia
    0

  • Hi Nick, I would say you need to accomplish: a) Getting the company a new phone number. b) Getting the developers to put a landing page for each location on the site. c) Building new citations for the new location, not piggy-backing on citations for the old company. After all, despite the fact that The Car People occupy a building that was previously occupied by another business, there is no relationship between the two (or, at least, there shouldn't have been, if not for that decision to keep the other company's phone number). d) Telling the client that some of the decisions that have been made are going to make it essential to have a lot of patience here while you try to create a data cluster out on the web that Google can trust. Right now, it's unlikely that they have this. It's going to have to be created over time with a lot of care.

    | MiriamEllis
    0

  • Oleg is correct: you need a 301 to solve this ASAP. However, that method only works on Apache servers. You need to determine what type of server the domain in question is hosted on. It will either be a Linux-based Apache server or a Windows IIS environment. If it is the latter, your method of redirecting will be entirely different. Google "how to 301 in IIS" for instructions, but really your hosting provider should be able to do this for you.
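    For the IIS case, the equivalent of an .htaccess rule lives in web.config. A sketch, assuming the IIS URL Rewrite module is installed (the rule name and target domain are placeholders):

    ```xml
    <configuration>
      <system.webServer>
        <rewrite>
          <rules>
            <!-- 301 everything to the new domain; requires the URL Rewrite module -->
            <rule name="Redirect to new domain" stopProcessing="true">
              <match url="(.*)" />
              <action type="Redirect" url="https://www.new-site.example/{R:1}"
                      redirectType="Permanent" />
            </rule>
          </rules>
        </rewrite>
      </system.webServer>
    </configuration>
    ```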

    | jesse-landry
    0

  • Hi There, Karl's answer was fantastic, and I just want to emphasize as well: if this is a decently strong brand, I would put some more effort into recovering from the manual penalty. As tedious as this can be, if it were just a "churn and burn" site I'd say just axe it and re-build. But go to Keyword Planner: how much brand search volume does the company receive? Try [brand], [brand.com], [www.brand.com], etc. If this number is at least in the hundreds or thousands of brand searches a month, I'd work to remove that penalty and let the brand signals help you recover. Also, does the main site receive a lot of direct traffic? It just sounds like, if you can get the penalty lifted, it will be an easier recovery than with some exact-match domain that's not really a brand. -Dan

    | evolvingSEO
    0

  • Thanks guys. Very helpful.

    | yogitrout1
    0

  • Just following up on this one. I've just gone through this process (http://youtubecreator.blogspot.co.uk/2013/04/using-google-page-identity-on-youtube.html) for Distilled, and it does work. It's not perfectly elegant, but it will do the job.

    | PhilNottingham
    0