Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Thank you, Jane. You rephrased my question much better. You're correct that the old site wasn't being penalized (rebranding was the reason we moved it). I have plenty of time to improve the new site before I need to use the old domain. Thanks again, Harriet

    | zharriet
    0

  • These pages are 99% similar to each other: http://www.copyscape.com/compare.php If you want an old-school strategy like this to work, there are no shortcuts to creating unique content specifically tailored to that location.

    | irvingw
    0

  • The problem is that FeedBurner gets new posts from the RSS feed itself, and that RSS feed isn't working right now. The fix still has to be found in WordPress or on the server itself; FeedBurner is fine.

    | Er_Maqui
    0

  • Does the site even need HTTPS? If not, just delete all the rules sending it back and forth, and create a 301 for each https URL to send it to http. You could probably knock out all 132 in less than 2 hours. (I have done 100 in less than 45 mins ;)) "A uniform redirecting of http to https is not an option so I'm looking to find out the best practice for reconfiguring these 302s to 301s within .htaccess?" Yes, you can do a sitewide redirect to send all URLs to the secure version: http://www.besthostratings.com/articles/force-ssl-htaccess.html (see the sketch below). I agree with you, though, that htaccess is a mess, lol. If that is any indication of what the rest of the site has in store for you, whew.
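
    A minimal sketch of that sitewide force-HTTPS rule in .htaccess, along the lines of the linked article (assumes mod_rewrite is enabled; the domain is a placeholder):

      RewriteEngine On
      # Send any plain-HTTP request to its HTTPS equivalent with a permanent (301) redirect
      RewriteCond %{HTTPS} off
      RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]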

    | David-Kley
    0

  • Hi there, Is this still happening, or does it seem to have been taken care of? Cheers, Jane

    | JaneCopland
    0

  • Hi, The good news is I've worked on very similar projects before, and looking at your examples, you're configuring it almost by the book, except you shouldn't use rel=next/prev and a canonical together. It's either/or, so you're probably going to need to ditch your canonical; see http://googlewebmastercentral.blogspot.co.uk/2011/09/pagination-with-relnext-and-relprev.html and the sketch below. However, I've configured sites almost exactly as you have and found that Google has just randomly chosen different (and multiple) combinations of page and sort order to rank in different sections. Once they get added to the index, it's a real chore to get them removed. I've learnt that if you genuinely don't want your sorted pages to appear in SERPs, you should use AJAX (and not have AJAX crawling turned on), e.g. "/?pg=1#dir=desc&order=price"; everything after the hash won't get crawled by Google. If you can't do AJAX, then you can add noindex to sorted pages and (at your own peril) nofollow / robots.txt to stop some pages being crawled. Using nofollow / robots starts to move into sculpting PageRank, though, and IMO is to be avoided. Another approach to avoid pagination and the performance impact of very long lists is to create more subcategories to break the inventory up. It might not be possible for your inventory, but it's worth considering as a complete sidestep. George
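
    A minimal sketch of the head markup this advice implies, with placeholder URLs (pagination tags and no canonical on the normal pages; noindex only on the sorted variants you want kept out of SERPs):

      <!-- Page 2 of the unsorted listing: rel=prev/next, no canonical -->
      <link rel="prev" href="https://www.example.com/widgets?pg=1">
      <link rel="next" href="https://www.example.com/widgets?pg=3">

      <!-- On a sorted variant you genuinely don't want indexed -->
      <meta name="robots" content="noindex">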

    | webmethod
    0

  • Hi, thank you. The Moz crawler found an issue for my page: "Duplicate Page Content: content that is identical (or nearly identical) to content on other pages of your site forces your pages to unnecessarily compete with each other for rankings." It will be a one-page issue, but for now (about one month) it has not affected my Google position. Should I worry about it or...? Best regards

    | komir2004
    0

  • Hi Oleg, When I click that link, Google asks me to check for manual penalties, and when I do, it shows no penalties. I guess I will disavow the links anyway, so I will go ahead with that. Thanks for your help!

    | KerryK
    0

  • Thanks so much for taking the time to respond. I think I will add the https version to WMT and remove them that way. I will take a look through the .htaccess file and at creating the SSL robots file (see the sketch below). A while back, it seemed that Google was indexing a lot of my site as https, and then it dropped that and went mainly back to http. I will get that sorted to make it clear.
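
    A common way to serve a separate robots file over HTTPS is a rewrite in .htaccess; a minimal sketch, assuming mod_rewrite and a hypothetical robots_ssl.txt that disallows crawling:

      RewriteEngine On
      # Serve the SSL-specific robots file for HTTPS requests only
      RewriteCond %{HTTPS} on
      RewriteRule ^robots\.txt$ robots_ssl.txt [L]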

    | sparrowdog
    1

  • Hi Jane, thanks for stopping by and giving your input. You are right, no SEO tool is a one-stop shop for carrying out an SEO audit. We use a myriad of tools to analyze the link profile and on-page elements, with a lot of manual scrutiny, plus Google penalty checks from a GA and GWMT account perspective. The major tools we use include, but are not limited to, Moz, MajesticSEO, ahrefs, SEMrush, BrightEdge and Maven. Though these tools overlap in many areas and features, consolidated audit data from them gives us a holistic picture of the issue at hand, with a lot of manual checks going into the process. Recently, we saw a situation where page loading time was an issue while everything else was superior to the competition; we use a set of tools for page-load performance testing, looking at site speed data from GA. Sometimes, historical indexing data can also have an impact on present ranking, especially when the same website operated in a different niche (or even probably operated in an illegal niche) and is now serving different content with no change in ownership (Whois data). In cases like these, we look at Wayback Machine data and try to analyze things. The list of audit checkpoints goes on and on... :)) We try to leave no stone unturned when it comes to doing an SEO audit, and as you know, SEO forensics is a very time-consuming, tiresome process, especially when we have a client with better SEO parameters than the competition who is crying foul about Google. It's a daily battle, and the best thing about SEO is, we love it, and Google is both our strength and our weakness ; )) Best regards, Devanur Rafi

    | Devanur-Rafi
    0

  • I agree. I think it's primarily a measure of the DOM efficiency of the code itself and how efficiently a browser can render it. Server speed is a minimal part of it. All the caching in the world doesn't stop the browser having to restart rendering because of inline CSS. A faster-responding server and a quicker page download only let the browser start sooner on the 90% of the work that is relevant for PageSpeed, most of which is handled by the browser alone.

    | DanielMulderNL
    0

  • Hi, so I'm assuming you're on IIS (I'm no expert on IIS; I think you will need to configure the web.config), and I'm just going to step back now and get my coat, as I only have experience with Apache.

    | DeanAndrews
    0

  • sterlingbuild.co.uk has a poor-quality backlink profile with anchor text heavily weighted towards [velux windows]. But links are still the most important factor for ranking well. Despite Penguin, crackdowns on link networks, etc., old-school link building still works, and this is just one of many examples of it. It's definitely not a good long-term idea, but if you need sales quickly, it works.

    | Kingof5
    1

  • Ah, that's a shame. It handles our image sitemaps / web sitemaps without issue, so I would have thought it would be okay with any videos. You could ask their support to test it, on the off-chance it collates the information you need it to; they're very good and get back to you within 24 hours.

    | Whittie
    0

  • Not back during the fall of 2013, but more recently there's been a change, and it's likely because of the website vendor switch. Will the links eventually disappear from GWT? If so, do you know the timeframe?

    | EEE3
    0

  • Hi there, Just to add to what Martijn has said, you should be able to specify the pages that target the US with this tag and leave the pages for international users unspecified (see the sketch below). This means creating different pages / URLs, of course, rather than using IP detection, which isn't advised for changing content.
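
    If the tag in question is hreflang (an assumption on my part, since Martijn's reply isn't quoted here), a minimal sketch with placeholder URLs might look like:

      <!-- On the US-targeted page -->
      <link rel="alternate" hreflang="en-us" href="https://www.example.com/us/page">
      <!-- Default for all other, unspecified users -->
      <link rel="alternate" hreflang="x-default" href="https://www.example.com/page">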

    | JaneCopland
    0

  • I was just hired at this company about 4 months ago, and for years they have been running the blog through WordPress; it had about 40k visitors last year. I decided that running the blog on the website would be a great boost for SEO and lead to better conversions. The site was made from a Shopify template, and the social media manager hates the layout and the stats he gets from it. He decided that he would rather go back to WordPress and wants to convert all the posts that he created in Shopify to WordPress. I was not sure whether duplicating those 20 or so posts would penalize us on Google or not. Also, if we could post to both, I would still get the benefits of the blog being on the site and he would get the benefits of having it on WordPress. Thanks for your help,

    | Mike.NW
    0

  • Without seeing the pages, the URL structure, and how similar/different the products are, it's a bit hard to advise. However, in general, if you already have a page ranking, and it is possible to overcome the confusion issues (which can probably be done through design?), I would probably leave the page there and tweak the content to include your 'blue widget' (but remember that part of the reason you are ranking for the term 'red widget' is the content on your page, so it will still need to be optimised for this). I've done 301 redirects a few times (see the sketch below), and when done internally they usually don't cause any long-term drops; however, they may cause your page to disappear in the short term (which can last a while) while Google re-ranks the new page (and the exact ranking will depend on the new page's URL structure, content, page hierarchy, etc.). Hope that helps and makes sense
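
    If you did go the redirect route, a minimal sketch of a single-page 301 in .htaccess (both URLs are placeholders):

      # Permanently point the old product URL at the combined page
      Redirect 301 /red-widget https://www.example.com/red-and-blue-widgets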

    | tomwhite
    0