Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • The title you use on each page should match that page's content. So if someone searches for "Online Billing Software", the title should target the page www.landingpage.com/subpage.html. However, there is no guarantee this is going to happen. It might be worthwhile having a quick read through the basics that SEOmoz have put together here as well: http://www.seomoz.org/beginners-guide-to-seo A great guide that will really give you some helpful tips. Andy

    | Andy.Drinkwater
    0

  • It's a site/server setting issue. Basically, web crawlers don't store cookies, and a cookie is needed to keep a session alive between the client and server. So servers usually redirect the cookieless user to the same page, but add some kind of unique ID to the URL to track the user and avoid treating the same visitor as a new user each time they open a page. (In ASP.NET, this can create a new profile for every page a crawler opens.) In your case, it looks like cookieless users are redirected to a specific error page, warning them that they probably need to enable cookies. In your site or server settings (i.e. web.config in ASP.NET) you can stop the server from redirecting cookieless users. You can also write a function that disables the cookieless redirect for robots only.
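As a rough sketch of the config side, assuming ASP.NET with standard session state (adapt the fragment to your actual web.config), forcing cookie-only sessions stops the URL-rewriting redirect entirely:

```xml
<!-- Sketch of a web.config fragment; adapt to your real configuration -->
<configuration>
  <system.web>
    <!-- "UseCookies" tells ASP.NET never to rewrite the session ID into the URL,
         so cookieless visitors such as crawlers are not redirected -->
    <sessionState cookieless="UseCookies" />
  </system.web>
</configuration>
```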

    | smarties954
    0

  • If you make a change to a page and the search results in Google later reflect that change and your position has moved, you can assume the change affected your position. However, Google continuously re-evaluates rankings. I have a personal theory: when you add content or make modifications to a page, I've noticed in our case that Google has a tendency to rank you a bit higher for a while, I think just to see if people click on the page. If people react positively by clicking and staying on the page, then Google might assume the page is relevant enough to stay there or higher (and vice versa).

    | smarties954
    0

  • I think there are two ways to do it. (Quick version) Redirect all pages from your old domain to the new domain. However, if you have links with specific keywords pointing to old posts, all the weight will be transferred to your new home page, and since that content will most likely be far different from the original posts, you might lose some of the positions those posts used to hold. (Long version) Scrape all the old articles, put them on your new server, then redirect each article independently from your old site to the new site, i.e. redirect www.myoldsite.com/article123 to www.mynewsite.com/article445. This way you keep all the valuable links pointing to those posts, and the context of each link still matches the content of the page, like nothing happened (if you have such links, of course). I am no expert in blogs, but I did the long version with our online store for each product and it worked like a charm. I kept the contextual links, PageRank and position for every one of them.
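The long version boils down to one permanent redirect per article on the old server. A minimal sketch, assuming Apache with .htaccess (the paths and targets are just the illustrative ones from above):

```apache
# One 301 per article on www.myoldsite.com; map each old URL to its real new home
Redirect 301 /article123 http://www.mynewsite.com/article445
Redirect 301 /article124 http://www.mynewsite.com/article446
```

Each line passes the link equity and context of that specific post to its new address, rather than dumping everything on the new home page.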

    | smarties954
    0

  • Howdy nyc-seo, This is a really good question with lots of implications. Although there's no single "right" answer, there are a few things you might want to consider: Subfolders are good for organizational purposes and as such can help structure your content. For example, SEOmoz puts all the blog content under seomoz.org/blog Subfolders can contain keywords that help with CTR and possibly with rankings. This may be good in certain situations, like in ecommerce. i.e. example.com/bird-feeders/hummingbirds That said, shorter URLs tend to perform better in search results, and you want to avoid keyword stuffing in your URLs. Also, with too many subfolders you can run into crawling issues. In these cases, it's best to keep your site architecture as "flat" as possible, without too many additional layers of sub-directories. Some additional resources that may help: http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites http://www.seomoz.org/blog/site-architecture-for-seo Hope this helps! Best of luck with your SEO.

    | Cyrus-Shepard
    0

  • As far as upgrading PHP on a server - this is for a different client, I seem to recall? I would have a real problem with a developer saying they weren't going to upgrade because it might break things. Of course it might break things, but there are industry-standard approaches to dealing with this. For example, create a duplicate version of the site on a server instance running the newer version of PHP, and do a full Quality Assurance analysis on the dev site to find and fix anything that has issues with the new PHP version. Then deploy back to the live site with the PHP upgrade. This is standard operating procedure, and it's necessary because there will come a time when any older server software is no longer supported and therefore becomes a security risk, as it will be unpatched. Planning for these kinds of upgrades should be included in any website operational plan. Also, their solution of moving WordPress to a subdomain is no protection whatsoever given that it is an extremely vulnerable version. First, the site is just as vulnerable to being hacked again, as it is still unpatched. Being on a subdomain has no effect on this. They have also ruined the SEO value of that blog by moving it to a subdomain instead of fixing the issue and keeping it as a subdirectory of the prime site. And depending on the type of vulnerability exploited, it may still be possible for a hacker to get into the server via the vulnerable WP, then traverse from the subdomain to the prime site and cause harm there as well. In the short term, if there truly aren't resources to properly do QA (Quality Assurance) on a dev site running an updated version of PHP, the alternative would be to move the WordPress install to its own server or VPS running a current version of PHP, upgrade it and apply security patches, then use a reverse proxy setup to have it show up as blog.domain.com (or even move it back to domain.com/blog).
This would at least allow for a properly secured WordPress that could also use current and new plugins. It would, however, come at the expense of a slightly more complicated reverse proxy setup. Hope that answers your question? Paul
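For what it's worth, the reverse-proxy step might look something like this on the main site's server, assuming Apache with mod_proxy/mod_proxy_http enabled (the internal hostname is a placeholder):

```apache
# Requests for /blog on the prime domain are served by the patched
# WordPress running on its own VPS; visitors never see the internal host
ProxyPass        /blog http://wp-internal.example.com/blog
ProxyPassReverse /blog http://wp-internal.example.com/blog
```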

    | ThompsonPaul
    0

  • OK, so you're only doing Jordan tours, and you're not doing other things that would require you to make subfolders, such as tours, hotels, flights. Do keyword research to make sure exact match "jordan tours" is your main target and not, for instance, "tours in jordan". Also feel free to mix it up. I would think that your C option is the best, since keywords in the filename are better for SEO than keywords in the directory:
    www.yoursitewithoutkeywords.com/
    www.yoursitewithoutkeywords.com/jordan-bike-tours
    www.yoursitewithoutkeywords.com/5-day-inclusive-tours-in-jordan

    | irvingw
    0

  • When looking at duplicate content, we look at the code on the page as well as the text. If the only thing different between two pages is a couple of words, we're going to flag that as duplicate content. The best suggestion is to write unique content that involves more than swapping out a few words.

    | KeriMorgret
    0

  • Thanks Tom, I do agree about the importance (or lack of it) of the domain/keyword issue, I was more curious as to why it wasn't being seen. But you've put my mind at rest.

    | lindsayjhopkins
    0

  • It used to always be suggested that directories were linked to with a trailing slash. The thinking there is that it indicates the server should be looking for a directory, not a file, making the look-up faster. I've never been able to notice or measure any difference, though, to be honest. I would suggest just picking one and sticking to it. Also make sure that one of them redirects cleanly to the other. Which way around you decide to do it is probably less important.
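If you settle on the trailing-slash version, the redirect could be sketched like this in an Apache .htaccess (a guess at a typical setup; the pattern is illustrative, so test on a staging copy first):

```apache
RewriteEngine On
# Leave real files such as /style.css alone
RewriteCond %{REQUEST_FILENAME} !-f
# 301-redirect /about to /about/
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```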

    | matbennett
    0

  • Craig, thanks for your help anyway.

    | tonyklu
    0

  • I would just 301 that URL to a similar product that is as close as possible, and/or to the home page of the site, as this would be a better user experience than returning a 404. That said, if some URLs on your site 404, this fact alone does not hurt you or count against you in Google's search results. You could also 410 them, as 410s tend to drop out of the index faster. If the page has gotten a lot of traffic, then I might want to NOINDEX it. All in all, there isn't a one-size-fits-all answer here, especially for eCommerce.
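For example, in Apache the 301 and 410 cases could be sketched like this (the product paths are made up for illustration):

```apache
# Send shoppers from a retired product to its closest replacement
Redirect 301 /products/old-widget /products/new-widget
# Tell crawlers another product is gone for good, so it drops out of the index faster
Redirect gone /products/discontinued-widget
```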

    | ColinWhite
    0

  • Hi Highland, I have the very same problem as Stefan. mydomain.com/product.html mydomain.com/product.html?___store=footdistrict_es The SEOmOZ software tells me this is considered duplicate content. I went to the Magento plugin website, but it seems the plugin is not compatible with Magento 1.7. Do you know of a plugin that works with this version? Stefan - can you please send me the links where you have found the hacks? I am going to have a look to see if I can apply them. Thanks and regards
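Until a compatible plugin turns up, one common workaround (just a sketch, not Magento-version-specific advice) is a canonical tag on the parameterised variant pointing back at the clean URL:

```html
<!-- Placed in the <head> of both URL variants; domain and path are placeholders -->
<link rel="canonical" href="http://mydomain.com/product.html" />
```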

    | footd
    0

  • I don't believe there is any ranking benefit at all. The only benefits are, as stated above, the ability to corroborate spam signals and/or verify ownership info. As a general rule, I would make sure that all info anywhere on the web regarding you is consistent and uses an established NAP (Name, Address, Phone) format. My grandfather told me to always wear a belt and suspenders if you want to be prepared for anything, and I apply that to my views on consistency.

    | ColinWhite
    0

  • Thanks Chris. I checked the links to the UK Vogue page with both the internal link checker in Webmaster Tools and Screaming Frog. There were fewer to the UK page, so it's pretty clear I need to improve the architecture of links here. Regarding the top 10 women's magazines area, what would be the most effective way of utilising that content? I could do something as simple as a numbered list 1-10 with the mag names in a sidebar, or I could do a separate page, with a paragraph or two of content for each title. This could take the form of a blog post. I guess I could do both? Thanks for your extremely helpful input. Paul

    | TheUniqueSEO
    0

  • If a user types "tripping.com" into Google, Google is going to consider it a "URL search" and almost certainly return your site first. You probably need more SEO help on "Tripping", so in addition to giving you extra characters, that would tip the scale toward not including the .com for SEO considerations. If you are investing in creating "Tripping.com" as your brand, and putting "Tripping" in the title tag without the .com would cause confusion, then I'd opt for the user experience over the SEO consideration.

    | retailgeek
    0