Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • I had the same problem with the SEOmoz report flagging duplicate page titles, especially on paginated pages. I placed a rel=canonical from http://www.clearviewtraffic.com/page1 (for example) to http://www.clearviewtraffic.com/. That is what Google wants, but SEOmoz keeps reporting the error. I don't think that means the site actually has a problem. Maybe the SEOmoz gurus can help with this issue; that is my situation with the SEOmoz errors.
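For anyone who wants to double-check what canonical a page is actually serving, here is a minimal Python sketch (standard library only; the example HTML is illustrative) that collects every rel=canonical tag so you can confirm there is exactly one:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of every <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            a = dict(attrs)
            if (a.get("rel") or "").lower() == "canonical" and "href" in a:
                self.canonicals.append(a["href"])

html = ('<html><head>'
        '<link rel="canonical" href="http://www.clearviewtraffic.com/">'
        '</head></html>')
parser = CanonicalFinder()
parser.feed(html)
print(parser.canonicals)  # ['http://www.clearviewtraffic.com/']
```

If the list ever contains more than one URL, or a URL different from the one you intended, that is worth fixing before blaming the report.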

    | maestrosonrisas
    0

  • Agree with Tim. The key is that a 302 redirect is universally understood as a temporary redirect response: the link will (or at least should) not last.
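For reference, the two response codes side by side in Apache (a sketch assuming mod_alias is enabled; the paths are illustrative):

```apache
# 302: temporary - search engines keep the OLD URL indexed
Redirect 302 /summer-sale /current-landing-page

# 301: permanent - the new URL inherits the old one's signals
Redirect 301 /old-page /new-page
```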

    | MickEdwards
    0

  • You may just be experiencing the delay between crawl time and analysis time in GWT. I haven't seen anything official about this, but on my clients' sites and my own it appears it can take as much as a month after a recrawl of a page for Google to respect a 301 or rel=canonical in terms of what it reports in Webmaster Tools (and also what it shows in the search results). If you've carefully checked the complete set of HTTP responses and only see a single 301, I would let it sit for another two weeks and see if the problem goes away. FYI, I like to use HttpFox to check HTTP responses, as I can see the complete chain of responses, not just the first or last in the chain.
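To illustrate the "complete chain" point, here is a small network-free Python sketch: given the sequence of status codes observed in a redirect chain (as a tool like HttpFox would show them), it flags anything other than a single clean 301. The helper is hypothetical, just to make the checklist concrete:

```python
def check_redirect_chain(status_codes):
    """Return a short verdict for a chain of HTTP status codes,
    e.g. [301, 200] means old-url -> 301 -> final page served."""
    redirects = [c for c in status_codes if 300 <= c < 400]
    if status_codes[-1] != 200:
        return "broken: chain does not end in 200"
    if redirects == [301]:
        return "ok: single 301"
    if 302 in redirects:
        return "warning: 302 in chain (temporary, passes no permanence signal)"
    if len(redirects) > 1:
        return "warning: multiple redirects in chain"
    return "ok: no redirects"

print(check_redirect_chain([301, 200]))       # ok: single 301
print(check_redirect_chain([301, 302, 200]))  # warns about the 302
```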

    | MichaelC-15022
    0

  • Well, most certainly Google would consider that to be cloaking on the tumblr subdomain.  I'm not sure that the algorithm would spot this and find it to be a problem, but if I were a Google spam engineer, and asked to look at this, it would be very hard to convince me that it wasn't set up deliberately to build traffic to your site.  I would disavow that tumblr subdomain (the subdomain only, e.g. keywordxxx.tumblr.com, DO NOT disavow tumblr.com!).
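For reference, a disavow file entry scoped to a subdomain only looks like this (keywordxxx is the placeholder from above):

```text
# disavow the spammy tumblr subdomain only - never tumblr.com itself
domain:keywordxxx.tumblr.com
```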

    | MichaelC-15022
    0

  • Penguin: those links aren't your problem. As Matt states in this video, they're already handling all of the major affiliate programs automatically (and in most cases, you'll see the affiliate program implemented as non-301 redirects anyway). Panda: Panda is not a link-based measurement; it's a measurement of site quality based on on-page content, layout, etc. You're talking about inbound links from your affiliates, through an affiliate program, TO your site, so that's not an issue. Having said that, if YOUR site had a ton of OUTBOUND links that were occupying a lot of the above-the-fold real estate on your pages, that COULD affect how Panda sees your site (in a negative way).

    | MichaelC-15022
    0

  • Hi, If all the content you want to remove is in a separate folder/directory, use the approach from: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663427 With one exception: instead of robots.txt, set all of those pages to noindex - don't use robots.txt! (Very important.) If some pages in the same folder still need to stay in the index, but they are few compared to the ones you want removed, it's better to remove the whole folder and then get those few re-indexed than to go one by one removing the ones you don't need - it's all about the numbers. Cheers.
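To be concrete, the noindex approach means placing this tag in the <head> of every page you want dropped; the pages must remain crawlable (i.e. NOT blocked in robots.txt), otherwise Google never sees the tag:

```html
<meta name="robots" content="noindex">
```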

    | eyepaq
    0

  • Ciao, Google normally reads the sitemap every day, so it is better to upload a proper, up-to-date sitemap and use a robots.txt to keep the pages you don't want indexed out of the crawl. Maurizio

    | malecce
    0

  • Hi Marshall, I do not see a spammy link structure. In fact, if that is your top-ranking URL, it can be re-created in WordPress so it is exact. I believe that URL is so powerful because of the great content you put on the page, not because of the URL structure. I believe you would rank as well, if not better, using a single parameter, but it is really up to you; I also think the bots would crawl your site faster with one parameter. Remember: if you 301 redirect that link to an identical one, or change the structure, you will still send all that PageRank to the new URL, and as long as the new URL has everything needed to rank for local, you will rank regardless of the URL structure. Here is some more information; take a look at the way the URL is built: http://yoast.com/articles/wordpress-seo/#structure That works well, but you can also build a two-parameter link using the site structure. I have some news I should have remembered off the bat: WordPress will not allow you to redirect a link outside of its designated permalink style. The preferred method is http://example.com/%postname%/ - it is clean, and since you are talking about very long pages, it would be a way to make more pages while keeping your keywords in the URLs. You could opt for http://example.com/%category%/%postname%/ instead, but that would give every single link a category (two-stage) parameter. This tool may come in handy if you have some difficult URLs: http://yoast.com/wp-content/permalink-helper.php You cannot pick both without a custom plug-in, and from what I've seen and what I've read, that is not ideal. Make a decision and stick with one link style.
From http://yoast.com/wordpress-seo-url-permalink/ : "Out of the box, WordPress is a pretty well-optimized system, and does a far better job of allowing every single page to be indexed than any other CMS I have used. But there are a few things you should do to make it a lot easier still to work with. URLs - Permalink structure: The first thing to change is your permalink structure. You'll find the permalink settings under Settings → Permalinks. The default permalink is ?p=, but I prefer to use either /post-name/ or /category/post-name/. For the first option, you change the setting to /%postname%/: [image: Permalink-Settings.jpg] To include the category, you select 'Custom Structure' and change the value to /%category%/%postname%/. If you previously had ?p= as your permalink, WordPress will take care of all the redirects for you. This is also true if you change from /%postname%/ to /%category%/%postname%/. If you change from any other permalink structure, you might want to consult my article on changing your WordPress permalink structure and the tool that you'll find within it." Now you can pick from the options there: http://yoast.com/wp-content/permalink-helper.php I do see one URL that could possibly be broken into new pages instead of adding a parameter. Lastly, we're talking about tagging a page or using a category in the blog. http://www.screamingfrog.co.uk/seo-spider/ is an awesome tool. See also http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1235687&topic=2371325&ctx=topic and http://www.seomoz.org/blog/seomoz-for-local-optimization I will send you my info now; please accept my apologies for taking so long to get back to you - I got four phone calls in a row and had to take them. Sincerely, Tom

    | BlueprintMarketing
    0

  • If a page uses parameters to build the content, say www.example.com/topic?page=1, www.example.com/topic?page=2, and so on, the URLs aren't the same, so each one will be indexed as a different page. Both options you mention will create a huge set of new pages. The query-parameter one is OK, but a URL with text that describes the content is more user-friendly. Either method could be used to build it; with some ASP and URL rewriting it shouldn't be much different. I would personally go with a URL that describes the content, like: www.example.com/topic/subtopic
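Since ASP is mentioned, here is a sketch of how the friendly URL could map onto the parameter version with the IIS URL Rewrite module (the rule name, pattern, and topic.asp are illustrative assumptions, not from the question):

```xml
<rewrite>
  <rules>
    <rule name="friendly-topic-urls" stopProcessing="true">
      <!-- /topic/subtopic  ->  /topic.asp?page=subtopic -->
      <match url="^topic/([^/]+)/?$" />
      <action type="Rewrite" url="topic.asp?page={R:1}" />
    </rule>
  </rules>
</rewrite>
```

This fragment would sit inside <system.webServer> in web.config; visitors and crawlers see only the friendly URL.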

    | FedeEinhorn
    0

  • Hi Matthew, The IP location is a signal; it used to be stronger than it is now, but it still counts. Take into consideration that it is not the only signal that tells Google your site is relevant to a specific country: using a ccTLD (or geotargeting in Google Webmaster Tools), specifying the region in your hreflang annotations, and the content itself (adding the targeted location to the different elements of the pages) are additional signals you should align as much as you can to tell Google which country market is relevant for you. So when someone asks me this question looking to move their site to local hosting, or to a service that can provide an IP in the country the site is targeting, I ask whether the move also makes sense for other reasons (better speed for visitors in that country, better support, etc.) and from a cost perspective; if so, they should do it. Nonetheless, if it's a painful migration that will cost too much and they still have other factors to optimize, I would likely recommend first seeing the impact of those other aspects, which cost much less to implement. Your case is different: from what you say, it is about moving your server AWAY from the relevant country, so it is about saving or gaining in aspects that are not SEO, since you would be taking away a signal you were already providing. So what I would do first is try to find another service that keeps giving you a UK IP and also meets your other requirements, keeping the signal you were already sending while achieving whatever you're looking for by changing your hosting service.
If that is impossible, then yes, you would likely have to accept losing a bit of the geotargeted relevance you already had; depending on how well you optimize the other relevant factors, the effect might be stronger or weaker. Thanks!
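For reference, the hreflang annotations mentioned above are link tags in the <head> (they can also go in the XML sitemap); the example.com URLs here are placeholders:

```html
<link rel="alternate" hreflang="en-gb" href="http://example.com/uk/" />
<link rel="alternate" hreflang="en-us" href="http://example.com/us/" />
```

Each language/country version should list the annotations for all versions, including itself.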

    | Aleyda
    0

  • Hi, Indeed the stage site is still out there: http://screencast.com/t/ZbmvsYE7njYv Again, that's easy to fix - if you want to - as you just need to verify the site in Webmaster Tools and then use the removal tool. For the main site, your visibility is low now: http://screencast.com/t/PQzdxTih Can you identify in this graph when the stage site was pushed into the index? (That would show whether it caused any drops in visibility / rankings.) Anyway, the main site was not very stable even before; the link profile is not strong enough to set it on a stable course.

    | eyepaq
    0

  • Hi Christopher, Google stated in February of this year that they plan to make duplicate listing management easier in future, showing all versions of a listing in the dashboard and asking if they are all of the same business, but this has yet to materialize unfortunately. It can't happen soon enough, in my opinion! Right now, one way to deal with this is to go to the duplicate Google+ Local pages in question and click the 'report a problem' link on the right hand side and go through the troubleshooter. You will find a radio button for 'Place is a duplicate of another place'. That would be the one to choose. I would try that first and give it a few weeks to see if your request for removal works. If not, the second thing I would do is go to the My Listing Has Incorrect Information Troubleshooter: http://support.google.com/places/bin/static.py?hl=en&ts=1386120&page=ts.cs Again, select the radio button that refers to duplicates and walk through the wizard. It will ask you if you already tried reporting through the 'Report A Problem' link. This troubleshooter may actually lead you to a phone call, if you are lucky. Be prepared for this process to take time, Christopher, and keep crossing your fingers that Google will eventually implement easier dashboard management for this incredibly common issue.

    | MiriamEllis
    0

  • Yes, that's my understanding as well. I guess my concern was directing all SEO Equity (through 301s and canonical tags) to a page that can't be found through the natural navigation of the website. Which I understand is something Google pays attention to when crawling a site. Thanks

    | Bucktown
    0

  • Of course, it depends how much other content you have and are creating. There is also the question of WHAT you are trying to rank for. The keyword-stuffing penalty generally hits when the stuffing is excessive and uncalled for. You will probably not rank in the top 10 for "dining room" if the rest of your content is automatically generated. As a very general rule: do not use the keyword on a page more than 25 times and do not exceed 10% keyword density. BUT it may vary from case to case, keyword to keyword, and market to market.
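As a rough sanity check, keyword density is just occurrences divided by total words. A minimal Python sketch (the 10% threshold above is a rule of thumb, not an official limit; this counts single-word keywords only):

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` that exactly match `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

text = "dining room tables for every dining room and dining style"
print(round(keyword_density(text, "dining") * 100, 1))  # 30.0 (percent)
```

A page scoring like the example above (30%) would be well past the rule-of-thumb ceiling.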

    | Stramark
    0

  • Hi Ricarda, We have numerous sites hosted on DNN and have been able to implement both canonical tags in the header and 301 redirects fairly easily. I'd recommend your developer consult DNN support if they are having difficulty.

    | MichaelYork
    0

  • I can't imagine a modern eCommerce platform creating a new product page URL every time you added more stock for an existing product. Perhaps this is being done incorrectly? Does it have something to do with how the SKU or product ID is entered, or some other user error? If not, I'd suggest changing platforms. In the meantime I guess rel canonicaling to one of them would work, but what happens when someone buys from that page/product ID? Is there only ever 1 in stock and, if so, does the URL go away/404/redirect when that happens? I think rel canonical would be a band-aid in this case. Let us know how it works out. Which eCommerce platform is it?

    | Everett
    0

  • Thanks Mike. I think part of my problem is that, being a CMS, a lot of the code is automatically generated, so the code in the header of the blog archives is the same as the code in the header of a blog post. I guess I'm stuck unless I can find some kind of SEO plug-in for Umbraco. The site was set up by another web company, so I'm not very familiar with it. If anyone has experience with this situation in Umbraco, I'd love some advice. Regards, Ian

    | iragless
    0

  • Hi JD, First things first, the site is dancing around in the SERPs so you may be ranking well and it is trying to gain traction, or you may not be ranking well at all and you are getting inflated spikes. Secondly use the Google Adwords keyword tool to search the exact search traffic for the terms you are optimising and let me know what they are. Be sure not to tick "broad" or "phrase" as these will give far larger figures, just make sure only "exact" is ticked. You could optimise as well as you like, but if there is no traffic for the search terms, then that is why you aren't getting any traffic. When you are searching for your rankings, use a different browser, maybe IE and ensure you aren't signed in to any Google account and your browsing history is cleared so you aren't getting any bias towards your own website. Get back to me on the above and we can go from there.

    | MichaelYork
    1

  • Here you go: http://www.mcanerin.com/en/articles/301-redirect-iis.asp Enjoy!
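The linked article covers the details, but for quick reference, the core of a classic-ASP 301 looks like this (the target URL is a placeholder):

```asp
<%
' send a permanent redirect instead of ASP's default 302
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.example.com/new-page/"
Response.End
%>
```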

    | Mike_Davis
    0