Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: On-Page / Site Optimization

Explore on-page optimization and its role in a larger SEO strategy.


  • Takeshi, that was great, thank you!

    | Jon-C
    0

  • How I would handle this is to try to strike a compromise. Ask them if it would be OK if all the pages were separate but functioned as one single page. Then build each of your pages, all with titles etc., and use a single-page AJAX container to load them all. You will still need to find a way to work in links to the individual pages, as search engines (I believe) are still having some issues with crawling this, and it can be a bit of a pain for users because the pages are never actually visited, so they are never registered in history for in-browser functions like Back. The technical end of it is actually pretty easy; just look up jQuery and AJAX for single-page web apps [http://css-tricks.com/ajax-load-container-contents/], or the jQuery website would be the place to dive into that :)!

    | yeagerd
    0
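A minimal sketch of the compromise described above: jQuery's `load()` pulls each page's content area into a single container, and the History API records each navigation so the Back button keeps working (the concern raised in the answer). The selectors, URL pattern, and helper name here are all hypothetical.

```javascript
// Pure helper: jQuery's load() accepts "url #selector" to insert only
// the #main element of the fetched page, not the whole document.
function fragmentFor(href) {
  return href + " #main";
}

// Browser-only wiring (skipped outside a browser environment).
if (typeof window !== "undefined" && typeof jQuery !== "undefined") {
  jQuery(function ($) {
    $("nav a").on("click", function (e) {
      e.preventDefault();
      var href = $(this).attr("href");
      // Load the target page's content area into the container.
      $("#main").load(fragmentFor(href));
      // Record the navigation so Back/Forward still work.
      history.pushState({ page: href }, "", href);
    });

    // Handle Back/Forward by reloading the recorded page.
    $(window).on("popstate", function (e) {
      var state = e.originalEvent.state;
      if (state && state.page) {
        $("#main").load(fragmentFor(state.page));
      }
    });
  });
}
```

Because each target page still exists at its own URL with its own title tags, crawlers and direct visitors get normal pages, while users who click through get the single-page behavior.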

  • Well, not exactly what you are looking for, but for Magento all I have got are these two: http://www.fmeextensions.com/catalog/product/view/id/76/s/magento-link-exchange-directory-pro/category/6/ and http://www.fmeextensions.com/catalog/product/view/id/87/s/advance-sitemap-seo-suite/category/6/

    | MozAddict
    0

  • - Corporate website pages: I would probably focus on these if they are providing conversions.
    - Landing pages for specific content offers that are gated behind a form: do these landing pages even rank, given they can only be accessed by filling out a form? If so, these would come last.
    - Blog posts: check your Google Analytics for the blog posts that are bringing in conversions (time on site, sales, leads, etc.).

    I personally would let Google Analytics "show" me what to focus on first, and make the pages that are already doing well become even better.

    | Francisco_Meza
    0

  • Great point - thank you!

    | Freelancer13
    0

  • Well, "cheap shoes florida" is not grammatically correct, so right off the bat that should be "cheap shoes in florida". Your sentence above is spammy and painful to the reader, and it targets four different regions on one page. If you want to go the region route, then each page should target a different area. Especially if the main keyword is medium or high competition, you'll need dedicated, targeted landing pages with unique content in order to rank for state long-tail searches. This takes care of your spammy copy, because you're only including one state in the content. The keyword also really needs to stay under-optimized, appearing only a few times in the copy; don't overdo it. It definitely needs to be in the title tag and H1 tag.

    | irvingw
    0
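As a sketch of what the answer above recommends (one state per landing page, the keyword once in the title tag and once in the H1, unique copy elsewhere); the store name and copy are placeholders:

```html
<!-- Hypothetical landing page targeting a single state. -->
<head>
  <title>Cheap Shoes in Florida | Example Shoe Store</title>
</head>
<body>
  <h1>Cheap Shoes in Florida</h1>
  <p>Unique, state-specific copy goes here; mention the keyword
     sparingly rather than repeating it throughout the page.</p>
</body>
```

Each additional state would get its own page with the same pattern but its own unique content, rather than stacking four regions into one sentence.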

  • I have a great example. For the past few months my client was ranking organically on page one for some high-level attorney key phrases. We are a web content company working with an SEO, and in an effort to stay ahead of the game we made some minor changes to some geo-specific key phrases, and in less than a month we almost disappeared off the map. The website I am referring to is www.wolfandpravato.com, and one of the culprits with thin or no content beating us in the SERPs is www.representingtheinjured.com, among others. If you have any feedback or are willing to do a test or case study, I would be happy to provide more data. I hope to hear your thoughts soon. We also started doing PPC for the client. Thanks, Alex

    | WDWC
    0

  • Hi Kurt, Is the syndicated content on your blog affecting your ranking? If not, it should be safe to keep it, but removing it would be a safe bet in case it hurts you in the future; you never know when Google will push something new. However, two cons I find with removing it: you might lose some link juice if other sites actually linked to those syndicated pages, and you would have fewer pages for Google and other search engines to crawl. If you do decide to remove them, great! Google will do its own thing and crawl your site. However, if you want to let Google know and ask it to crawl again, you can go to Webmaster Tools (www.google.com/webmaster) and use the "Fetch as Google" tool. This will tell Google to crawl your page again. Hope this helps.

    | TommyTan
    0

  • You are very welcome - happy to help

    | Andy.Drinkwater
    0

  • Hi there, I have a similar problem using Weebly: one can only add tags to the home page. My URL is www.positiveimpact4health.com. Any ideas how to add the tags to each page? John

    | JohnIreland
    0

  • Hi Tom, Thanks, that helps. I did have an idea it was that, but I needed it confirming. Regards

    | OasisLandDevelopment
    0

  • Hey Jozef, You know, I have not come across this issue with PDFs causing problems as duplicates, but ultimately a duplicate is a duplicate, and if you want to republish this content then it would be worth examining the reasons for doing so. Is this purely, as you say, to give access to a special offer, something purely to benefit your users and not in any way for driving more search traffic? If so, I would just create a folder for these PDFs and block that in robots.txt. If your folder was simply called pdfs/, we would need something along the lines of:

    User-agent: *
    Disallow: /pdfs/

    This way you can use these PDFs and make them available without ever having to worry about them causing any kind of duplication issue. Additionally, PDFs do sometimes get indexed and can make pretty poor landing pages, especially if they have the branding of another company and no navigation to get into your site, so again, blocking them is a good thing. If you are looking to use these to generate more exposure, then create a unique landing page and add the link to the PDF, but ensure the PDF is blocked in robots.txt. You can also issue a noindex directive via your web server in the HTTP header to block out specific files or file types, but I would imagine the simple robots.txt-plus-directory solution is perfectly suitable and easier to implement in this case. Hope that helps! Marcus

    | Marcus_Miller
    0
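The HTTP-header approach mentioned above is usually done with an `X-Robots-Tag` response header. A minimal sketch for Apache, assuming `mod_headers` is enabled (the file pattern is just an example):

```apache
# Send a noindex directive for every PDF served,
# without needing any markup inside the files themselves.
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>
```

Note that for the header to be seen at all, the crawler must be allowed to fetch the file, so this is an alternative to the robots.txt block rather than something to combine with it.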

  • James, that is definitely a good thought. Consider adding EXIF data to your images, or adding more accurate data, including GPS data if you can, depending on the nature of the images. I am sure it will start getting used more and more in times to come.

    | NakulGoyal
    0

  • Hello Mr. Smith (I love saying that), Taking a look at the URL you provided: http://www.keepitpersonal.co.uk/silver-personalised-noahs-ark-money-box-p-108.html ...I wouldn't guess that keyword stuffing is a cause for concern. In fact, I like the recent engravings box, as it adds a nice touch to the information on the site. I also didn't find it spammy at all. The design is a little "busy" and perhaps outdated; sometimes with edge cases this is enough to trigger a Panda update, especially if you have poor user engagement metrics, but I couldn't find any duplicate content, and overall the site seems structurally sound.

    I know you're hesitant to blame the link profile, but 80% of the time it's the culprit in these cases. The difference between what you and your competitors are doing is not always fair, but is based more on what Google defines as "unnatural" at the moment. I ran your URL through the Remove'em link analyzer (http://www.removeem.com/), which loosely identified 285 "suspicious" links. You can do the same sort of analysis with these two other tools as well: http://tools.seogadget.co.uk/ and http://www.linkdetox.com/. Looking at the backlink profile in Open Site Explorer, we see a lot of comment signature links and directory listings, both types of links that were targeted by Penguin and/or over-optimization filters.

    I'd check Google Webmaster Tools for any messages and/or errors. Even if you didn't get a link notice, you can file a reconsideration request to see if they will verify a penalty; sometimes this will get you more information than you had before. In any case, it might be well worth it to do a complete link audit and cleanup, and even go through the disavow process if necessary. Just my 2 cents. I know it's probably not the best news you wanted to hear, but I really like your site and want to see you succeed!

    | Cyrus-Shepard
    0

  • Hi guys, Thanks, very helpful. We are both an ecommerce site and an information site, so I guess we will go with publisher on the homepage and author for content pages. Though we have tried author on product pages, and G displays the author photo in the results. Not sure what effect that has, but I like to see a face related to the product (we sell information products: legal documents). It also helps our product stand out in the results. Thanks again, Patrick

    | dexm10
    0
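For reference, the publisher/homepage and author/content-page split discussed above was typically marked up like this at the time (the Google+ profile URLs below are placeholders, not real profiles):

```html
<!-- Homepage: link the site to its brand page -->
<link rel="publisher" href="https://plus.google.com/+ExampleBrandPage"/>

<!-- Content or product pages: attribute the page to an author profile -->
<link rel="author" href="https://plus.google.com/+ExampleAuthorProfile"/>
```

Each page carries only one of the two tags, which matches the "publisher on the homepage, author on content pages" approach described above.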

  • I agree with donford. I've never heard of duplicate images being flagged as duplicate content. What is more, almost everyone sets up their galleries in this manner, so you really don't have anything to worry about.

    | UnderRugSwept
    0