Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • Hi Peter, many thanks. I will bear this in mind; sound advice. David

    | David-E-Carey
    0

  • Thanks Adam

    | Adamshowbiz
    0

  • You should get an advanced Google Analytics consultant to figure out what is happening. If I were facing this problem, I would be running a lot of Advanced Segments. Every business is different, so there are no canned responses. Another reason I say to hire someone for an analysis is that if what you say is true and the site doesn't have a penalty, then you could be losing traffic for many other reasons. You need to find out "why" conversions are lower.

    | Francisco_Meza
    0

  • Hello EGOL, thanks for the resources. I bookmarked both for later reference, especially the article written by Marie Haynes. I'd like to take a moment to update my findings. Earlier this morning, as I explored the subject site's hosting environment, I found several backup copies of the site stored in "/home/alias". I restored the last backup on a local host and found that the site did in fact undergo a restructuring of the URLs, not once but several times. Without looking further, I suspect the site went through several (8) phases of restructuring, all within a period of 8 months, which would make for broken links and fluctuations in the SERPs. This leads to another question, which I posted here on this forum.

    | UplinkSpyder
    0

  • Hi, thanks also; it's very much appreciated. The no. 3 is just another URL that lets a user add this page to a shortlist. Thanks again to you both.

    | AkilarOffice
    0

  • Thanks for all your advice!

    | madegood
    0

  • It all depends on the server type. Apache servers treat upper-case and lower-case URLs as different pages (the wrong case will usually just return a 404), whereas an IIS (Microsoft) server serves the same page regardless of case, which is where the duplicate-content risk comes from. You have an Apache server, so I wouldn't worry too much about it. If you ever consider switching servers, though, you will want to take a look at your URLs.
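
    If you ever do need to normalise casing on Apache, a minimal sketch looks like this (note that RewriteMap must be declared in the main server config or virtual host, not in .htaccess, and "lc" is just an assumed map name):

        RewriteEngine On
        RewriteMap lc int:tolower

        # Send any URL containing uppercase letters to its lowercase version
        RewriteCond %{REQUEST_URI} [A-Z]
        RewriteRule (.*) ${lc:$1} [R=301,L]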

    | WhoWuddaThunk
    0

  • Asking about the ideal configuration for a robots.txt file for WordPress is opening a huge can of worms. There's plenty of discussion and disagreement about exactly what's best, but a lot of it depends on the actual configuration and goals of your own website. That's too long a discussion to get into here, but below is what I can recommend as a pretty basic, failsafe version that should work for most sites:

        User-agent: *
        Disallow: /cgi-bin/
        Disallow: /wp-admin/
        Disallow: /wp-includes/
        Disallow: /wp-content/plugins/
        Disallow: /wp-content/cache/
        Disallow: /wp-content/themes/
        Sitemap: http://www.yoursite.com/sitemap.xml

    I always prefer to explicitly declare the location of my sitemap, even if it's in the default location. There are other directives you can include, but they depend more on how you have handled other aspects of your website - e.g. trackbacks, comments and search results pages, as well as feeds. This is where the list can get grey, as there are multiple ways to accomplish this depending on how your site is optimised, but here's a representative example:

        Disallow: /trackback/
        Disallow: /feed/
        Disallow: /comments/
        Disallow: /category/*/*
        Disallow: */trackback/
        Disallow: */feed/
        Disallow: */comments/
        Disallow: /*?*
        Disallow: /*?

    Sorry I can't be more specific on the above example, but it's where things really come down to how you're managing your specific site, and it's a much bigger discussion. A web search for "best WordPress robots.txt file" will certainly show you the range of opinions on this.

    The key thing to remember with a robots.txt file is that it does not cause blocked URLs to be removed from the index; it only stops the crawlers from traversing those pages. It's designed to help the crawlers spend their time on the pages that you have declared useful, instead of wasting it on pages that are more administrative in nature. A crawler has a limited amount of time to spend on your site, and you want it to spend that time looking at the valuable pages, not the backend.

    Paul

    | ThompsonPaul
    0

  • Hi there. In reality, you only have two options: rewrite the content on each localised page so it can pass as unique, or deindex/canonicalise the localised pages. I'm afraid that's all there is to it. If the client wants all of the pages to rank in Google, then they have to provide unique content for each page, or they run the risk of a duplicate content penalty. There's no other way, I'm afraid.

    What might be more achievable is to set up a main page for the products that are offered in the regions, with links going to each region's page. Those regional pages should be noindexed. It means those pages won't rank in Google, but on the flip side it means you only need to concentrate on ranking one page. If all your efforts head there, the amount of links and social shares it accumulates could make it a powerful page for your keyword.
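
    As a rough sketch of the deindex/canonicalise option, either of these tags in the <head> of a regional page would do it (example.com and the path are placeholders):

        <!-- Keep the regional page out of the index entirely -->
        <meta name="robots" content="noindex,follow">

        <!-- Or point the near-duplicate at the main product page instead -->
        <link rel="canonical" href="http://www.example.com/products/">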

    | TomRayner
    0

  • Thanks Matt, that's cleared it up. I was always under the assumption that the publish date displayed in the SERPs was the date the page was found and indexed. Thanks again, Greg

    | AndreVanKets
    0

  • Hello James,

    Why do these pages have "no SEO value"? Is it because they are AJAX pages, or because you have them noindexed? Or both?

    To answer your original question: using an on-click JavaScript event to send a user to a page other than the URL listed in the href attribute is borderline. It goes beyond the risk level I would feel comfortable with on an eCommerce site, but a lot of affiliate sites do this. For instance, all of their links out to merchant sites may go through a directory called /outlink/, so the href attribute might look like .../outlink/link1234 and appear to send the user to another page on their domain, when actually the user gets redirected to the merchant's (e.g. Amazon.com, Best Buy...) website. Sometimes the user is redirected from the /outlink/... URL, and sometimes they never even get that far because the JavaScript sends them to the merchant's URL first.

    It is not cloaking unless you are specifically treating Google differently. If Google doesn't understand your site, that is their problem. If you have code that essentially says "IF Google, THEN do this, ELSE do that", it is your problem, because you are cloaking. Make sense? There is a very distinct line there.

    The bottom line is: if you want to show users a certain page, then you should be showing that page to Google as well. If the problem is that the content on that page doesn't appear for Google (e.g. AJAX), then you should look into optimizing that type of content to the best of your ability. For example, look into the use of hashbangs (#!) as in: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started
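
    To illustrate the affiliate pattern described above (all URLs here are hypothetical), the markup might look something like:

        <!-- The href points at an internal /outlink/ URL... -->
        <a href="/outlink/link1234"
           onclick="window.location = 'http://merchant.example.com/product/1234'; return false;">Buy at Merchant</a>
        <!-- ...but the onclick sends the user straight to the merchant,
             so they may never actually request /outlink/link1234 -->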

    | Everett
    0

  • Many thanks for the very swift reply - appreciate it.

    | newstd100
    0

  • If you can write a regular expression to redirect only the old URLs to a new page, such as the home page, or closest category page - without redirecting every mistyped URL - then I would go ahead and do that. However, you do not want to redirect every mistyped URL, because that would create a "soft 404" situation.

    As Chris Menke mentioned below, if you have already redirected all of the "top pages", the rest, which probably have little or no external link authority, can just go to a 404 page, and will eventually be removed from the index. You will want to pay close attention to the 404 reports for several months afterwards just to make sure that you haven't missed any URLs with significant external traffic or links.

    If you really wanted to put in the extra effort, you could redirect any page that gets more than 5 visits a day/month (you choose the threshold), and any that have at least one external link. That's what I would do, personally. Also, if the category name is in the old URL, you could make the "catch all" redirects go to the most appropriate category instead of all going to the home page. It all depends on what you have to work with. If they are truly nonsensical, you will probably end up just letting all of the URLs with no traffic or links 404. Good luck!
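
    A minimal .htaccess sketch of that approach, assuming Apache (the paths and category names here are made up for illustration):

        RewriteEngine On

        # Old category URLs with a recognisable pattern go to the closest new category page
        RewriteRule ^old-shop/widgets/ /widgets/ [R=301,L]

        # Anything else under the retired path is deliberately left to 404
        # rather than blanket-redirected to the home page (avoids soft 404s)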

    | Everett
    0

  • Hi Alankar, yours is the classic case where implementing the rel="alternate" hreflang="x-X" annotation is strongly suggested. It serves the purpose of telling Google which URL to show to users depending on the language and/or location they are performing the search from. In your case, for instance, for the home pages of your sites it should be implemented like so:

        <link rel="alternate" hreflang="en" href="http://www.geekwik.com" />
        <link rel="alternate" hreflang="en-IN" href="http://www.geekwik.in" />

    I suggest you check out these official resources about this topic: "What's Hreflang?", "More details about Hreflang", and "How to implement Hreflang in Sitemaps.xml". You will surely also find this page useful: "Official Google FAQ about International SEO".

    | gfiorelli1
    0

  • First off, make sure that you are using .htaccess to redirect to either the www or non-www version. Be aware that every time you update Drupal core you may overwrite this file, reverting it back to the default; I make this mistake often. If you want to share your site, I can take a look; other than that, the canonical should take care of the rest.
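
    For reference, a typical non-www-to-www snippet looks like this (example.com is a placeholder; keep a copy of the file somewhere safe so you can restore it after a core update):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]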

    | mikeusry
    0

  • Thank you, didn't know ( and ) are only valid for Google Search Appliance We're going to use javascript/styling to hide the cookie message as our code is in the html of the page.

    | Bio-RadAbs
    0

  • You're welcome

    | Robert_G
    0

  • Thanks, Paul. I am testing the "Wordfence" plug-in, and I was able to identify (and delete) a malware file. I should probably look for a more secure host as well. Cheers, Bo

    | vibelingo
    0

  • Hi Thomas, that's not a good idea. You should publish your client's business name identically across the whole web. So, if they are Trusty Tax Preparation, list them exactly that way, with no additions or subtractions, every single place you list them. If you are using Schema to mark up the business on the website, for instance, rely on the address fields to differentiate Trusty Tax Preparation in Miami from Trusty Tax Preparation in Los Angeles. Remember, local rankings are founded on the clarity of NAP (name-address-phone number), so be absolutely consistent everywhere you list NAP.
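
    A rough JSON-LD sketch of that advice (all details are made up for illustration); the Los Angeles page would carry the identical "name" with its own address block:

        <script type="application/ld+json">
        {
          "@context": "https://schema.org",
          "@type": "LocalBusiness",
          "name": "Trusty Tax Preparation",
          "telephone": "+1-305-555-0100",
          "address": {
            "@type": "PostalAddress",
            "streetAddress": "123 Example Ave",
            "addressLocality": "Miami",
            "addressRegion": "FL",
            "postalCode": "33101"
          }
        }
        </script>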

    | MiriamEllis
    0