Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO Issues

Discuss site health, structure, and other technical SEO issues.


  • G'day Jesse, I have added the following code to my .htaccess file:

    RewriteEngine On
    RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /(.*)index\.html\ HTTP/
    RewriteRule ^(.*)index\.html$ http://www.just-insulation.com/$1 [R=301,L]

    All attempts to add similar code to redirect in the other direction result in the home page not loading, or the creation of an infinite loop. Open Site Explorer still returns a PA of 32 for www.just-insulation.com and a PA of 15 for www.just-insulation.com/index.html. The thought of all this lost link juice is making me really thirsty.
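    For reference, the usual reason a reverse rule loops is that it matches the internally rewritten URL (after Apache's DirectoryIndex maps / to /index.html) rather than the literal request line. A loop-safe host-canonicalisation rule, sketched for this domain and assuming mod_rewrite is enabled, looks like this:

    ```apache
    RewriteEngine On
    # HTTP_HOST reflects only what the client requested, so once the
    # visitor is on www the condition no longer matches and no loop occurs
    RewriteCond %{HTTP_HOST} ^just-insulation\.com$ [NC]
    RewriteRule ^(.*)$ http://www.just-insulation.com/$1 [R=301,L]
    ```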

    | JustInsulation
    0

  • Okay, I was able to pull your site code, so I know it works on the newest version of WebKit (the engine behind Safari), at least for me. You have some large coding issues that are most likely preventing you from being indexed properly. Your page structure is not allowing Google to read anything: Google can only read 16 words from your home page, which makes Googlebot think there is nothing there and that it is not a useful webpage. You also still have an issue with the forwarding to #welcome. The two links below will show you what Googlebot is able to see when indexing your website:

    http://www.feedthebot.com/tools/spider/test.php
    http://totheweb.com/tools/spider-test/index.php

    These show the problem: look at your header and your structure; there's a lot of junk and no real words for anyone to read. You can also use http://pro.moz.com/tools/crawl-test to check your site through Moz. Another superb tool is Screaming Frog: http://www.screamingfrog.co.uk/seo-spider/

    The reason I believe this is happening is that you are using AJAX, which prevents Google from reading your site correctly. The clue is the hash tag: "AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment." Here are links discussing the problem and how you can fix it:

    http://webmasters.stackexchange.com/questions/32882/why-is-google-still-not-indexing-my-website
    http://validator.w3.org/check?uri=http%3A%2F%2Fwww.memovalley.com%2F&charset=%28detect+automatically%29&doctype=Inline&ss=1&outline=1&group=0&No200=1&st=1&user-agent=W3C_Validator%2F1.3+http%3A%2F%2Fvalidator.w3.org%2Fservices

    You may want to consider using a boilerplate. I would build the site on WordPress, as its architecture is superb and you would not have these issues. Some further reading:

    http://moz.com/ugc/8-reasons-why-your-site-might-not-get-indexed
    http://architects.dzone.com/articles/best-html5-page-structure
    http://html5doctor.com/html5-seo-search-engine-optimisation/

    I would recommend using a trusted developer; I can give you a recommendation. I would use this gentleman, as he is very knowledgeable and a master of HTML5: http://www.gregreindel.com/

    As far as HTML improvements go, you are using HTML5. This is a useful tool if you are doing your own development or checking your website: http://www.feedthebot.com. Here is more information on HTML5:

    http://diveinto.html5doctor.com/
    https://developer.mozilla.org/en-US/docs/Web/HTML/Sections_and_Outlines_of_an_HTML5_document

    I would personally use a developer to fix this. I hope this helps, Thomas
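    For context on the hash-fragment issue above: Google's (now historical) AJAX crawling scheme only worked for "#!" URLs, which crawlers mapped to an `_escaped_fragment_` query parameter before fetching. A minimal sketch of that mapping in Python (the example URL is illustrative):

    ```python
    from urllib.parse import quote

    def escaped_fragment_url(url: str) -> str:
        """Map a #! (hashbang) URL to the _escaped_fragment_ form that
        crawlers supporting Google's AJAX crawling scheme would fetch."""
        base, sep, fragment = url.partition("#!")
        if not sep:
            return url  # no hashbang fragment, nothing to map
        joiner = "&" if "?" in base else "?"
        return f"{base}{joiner}_escaped_fragment_={quote(fragment, safe='=&')}"

    print(escaped_fragment_url("http://www.example.com/index.html#!key=value"))
    # http://www.example.com/index.html?_escaped_fragment_=key=value
    ```

    A plain "#key=value" fragment (no exclamation mark) never reached the server at all, which is why crawlers saw nothing behind it.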

    | BlueprintMarketing
    0

  • "3 of my pages have missing titles issues, missing meta description issues" - Did you do any updates on your site that could have led to the meta tags getting deleted from those pages? Identify the pages missing the meta tags and fix them ASAP. As for the duplicate content issues: you can use Copyscape to find out who might be duplicating your content, and the Moz tools can assist you in checking whether the content is duplicated internally on any of your pages.
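    Identifying pages with a missing title or meta description can also be scripted. A minimal sketch in Python using only the standard library (run it over each page's HTML however you fetch it):

    ```python
    from html.parser import HTMLParser

    class MetaAudit(HTMLParser):
        """Records whether a page declares a <title> and a meta description."""
        def __init__(self):
            super().__init__()
            self.has_title = False
            self.has_description = False

        def handle_starttag(self, tag, attrs):
            if tag == "title":
                self.has_title = True
            elif tag == "meta":
                a = dict(attrs)
                if (a.get("name") or "").lower() == "description" and a.get("content"):
                    self.has_description = True

    def missing_meta(html: str) -> list:
        """Return which of the two tags are absent from the given HTML."""
        audit = MetaAudit()
        audit.feed(html)
        missing = []
        if not audit.has_title:
            missing.append("title")
        if not audit.has_description:
            missing.append("meta description")
        return missing

    print(missing_meta("<html><head><title>Hi</title></head></html>"))
    # ['meta description']
    ```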

    | SEO5Team
    0

  • Hello Pikka, I searched for that page on Google and see the review rich snippet in the SERPs. Perhaps you are thinking this is going to change your ranking; it will not. It will change the way the search engine result looks, which can encourage a better click-through rate, which "could" indirectly improve rankings. But technically speaking, at this point in time there is no direct ranking benefit from having a review/rating rich snippet in the SERPs. They look good in the search results: See here. Be happy!

    | Everett
    0

  • Disabling the feature will keep the trackback link from being posted on your site, but it won't remove existing links from other sites. However, it may discourage people who are linking to you just to get the trackback link. This could be good or bad, depending on how you look at it. Personally, I don't want or need any links coming from someone who is doing it just for the trackback. Most of them are nofollowed anyway.

    | Everett
    0

  • Typically, if Google is choosing to show a snippet of content instead of your meta description, then there is something they don't like about your meta description. For instance, it could be too short, too long, over-optimized, not formatted correctly, etc. You can't force Google to use your meta description, but you can play around with rewriting meta tags to see if they end up liking one enough to use when someone searches for your primary keywords on that page. Also use the NOODP tag if you aren't already.
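    For reference, the NOODP directive mentioned above is set with a robots meta tag in the page head; a minimal example:

    ```html
    <!-- Ask search engines not to substitute the ODP/DMOZ directory description -->
    <meta name="robots" content="noodp">
    ```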

    | Everett
    0

  • Sawasdee khap! I'm a big fan of Thailand too: great scuba diving, great people, awesome food. Honestly, I wouldn't sweat your domain name. In fact, despite all the talk about lessening the importance of EMDs in ranking, I'm still seeing TONS of sites get massive boosts in ranking purely because of exact and partial match domains. I would not expect a domain name like yours to draw any sort of ranking penalty. By the way, nice site: lots of great, useful content (I presume it's original?). Email me privately at michael@visualitineraries.com if you're interested in doing a guest post on my site about Thailand.

    | MichaelC-15022
    0

  • Hi James, I assume that Google indexing your sitemap isn't the ultimate goal. Rather, Google indexing your individual web pages and, ultimately, your pages showing up at the top of the search results is the goal, is that correct? Your sitemap is only a suggestion to Google that you have pages you would like it to index. Google may or may not crawl pages just because they're listed on a sitemap. Roughly, it's the authority of the pages to be indexed that determines the frequency of Google's crawl, and that even includes the sitemap: the greater the authority, the more often they're going to be crawled. The last time your sitemap was revised, it may have been later in the crawling cycle, and this past time it may have been earlier in the cycle (or your domain could have lost authority). Ultimately, social networking, link building, and building authority are what will speed things up for you. In the meantime, be patient; you're on Google time now.
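    For context, a sitemap under the sitemaps.org protocol is just an XML list of URL hints, nothing more; a minimal valid file looks like this (example.com is a placeholder):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/page.html</loc>
        <lastmod>2014-01-15</lastmod>
      </url>
    </urlset>
    ```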

    | Chris.Menke
    0

  • I would select a few key publications and send them a press release - make sure it's actually newsworthy though. Although many people go online for things, plenty of people still read papers and they still have influence. If you have only sent out a small number of releases, the chances of running into duplicate content issues are not that high - they don't all put their content on the web. Often, they will also re-write the info you supply so that it matches their own style, so it won't be the words you wrote anyway (apart from quotes which will not be changed). If the content isn't on your own site, it is even less of a problem in terms of duplicate content. A newspaper article can help with brand awareness and even if readers don't type in a URL to go to your website, they may search for you online instead because they remember your brand. I wouldn't discount papers quite yet. Try it and if it works, keep going. If the time turns out to be not worth it, invest efforts elsewhere. You should be able to find out the newsdesk email addresses by going to the website of the publication you are interested in.

    | Houses
    1

  • Thanks a lot for sharing Travis. This is really helpful! Appreciate your help here.

    | AB_Newbie
    0

  • You're welcome, Clint. If the listing is new, that could certainly be a contributing factor. In the meantime, the client should be doing all he can to strengthen his website and citations and begin getting reviews. All of these will contribute to his hoped-for climb up the ranks. Good luck!

    | MiriamEllis
    0

  • Hello Matt, Two things: #1 - "The amount of PageRank that is lost through a redirect is currently the same as the amount of PageRank that dissipates through a link." So they are exactly the same. Source: http://www.youtube.com/watch?v=Filv4pP-1nw It was thought that more PageRank was lost through a redirect because Google does not want webmasters relying on redirects instead of actually updating internal links. At least that's what Matt told us at a search conference many years before denying it in this video. So... that's sort of a non-issue, for now, until Matt changes his mind about how he wants to explain it again. #2 - It is still "best practice" to update your internal links after a site migration instead of relying on the 301 redirects. The reason is that you don't just migrate once. Think long-term and you will see there is always going to be a possibility of changing the URLs again at some point in the future. If you do not update your internal links, you're going to be sending users through multiple redirect hops, which will THEN probably lose more PageRank than what is lost in a normal link, because it will be happening multiple times.

    | Everett
    0

  • If you use a MySQL database to store your website data, I think that doing this kind of automatic "archival" work with an automatic PHP script would take between 2 and 5 hours of work. I don't see why it should take more than that; if someone tells you it will, I would be suspicious. Either the programmer is not good enough, or they want to cheat you. That unfortunately happens more often than you think! Be sure to ask for a step-by-step description of how they plan to complete the job. If you have doubts, please feel free to ask me; I am a fairly expert PHP programmer. I don't work for others, just for myself (I built and keep tweaking my own websites virtualsheetmusic.com, musicianspage.com and others with very little help from external programmers). Good luck!
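    The kind of archival script described above, moving stale rows from a live table into an archive table, really is only a few statements. The original suggestion was a PHP/MySQL script; this sketch uses Python with SQLite purely so it is self-contained, and the `products`/`updated_at` names are hypothetical:

    ```python
    import sqlite3

    def archive_old_rows(conn, cutoff):
        """Move rows older than `cutoff` from `products` to `products_archive`."""
        cur = conn.cursor()
        # Create the archive table with the same columns, if it doesn't exist yet
        cur.execute("CREATE TABLE IF NOT EXISTS products_archive AS "
                    "SELECT * FROM products WHERE 0")
        moved = cur.execute("SELECT COUNT(*) FROM products WHERE updated_at < ?",
                            (cutoff,)).fetchone()[0]
        cur.execute("INSERT INTO products_archive "
                    "SELECT * FROM products WHERE updated_at < ?", (cutoff,))
        cur.execute("DELETE FROM products WHERE updated_at < ?", (cutoff,))
        conn.commit()
        return moved

    # Demo with an in-memory database
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE products (id INTEGER, updated_at TEXT)")
    conn.executemany("INSERT INTO products VALUES (?, ?)",
                     [(1, "2012-01-01"), (2, "2014-06-01")])
    print(archive_old_rows(conn, "2013-01-01"))  # 1
    ```

    Run as a cron job, something of this shape keeps the live table small without losing data, which matches the hour estimate in the answer.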

    | fablau
    0

  • Thanks for the answer, Oleg. This is helpful!

    | tbps
    0

  • Hello Bastian, With regard to outdated products the best solution I have found is to 301 redirect the old product URL to the closest category page. I prefer this to redirecting to another product page because once the new product is outdated you'll be going through multiple redirects. Look into updating the old blog posts and making them more relevant for readers today. If you are unable to do that I would just leave them where they are. As they get outdated they should drop in the rankings anyway. Or you could 301 redirect them to a more recent post.
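    In .htaccess terms, the product-to-category redirect described above is a one-liner; a sketch assuming Apache and hypothetical paths:

    ```apache
    # Send a discontinued product URL to its closest category page
    Redirect 301 /products/acme-widget-2000 /categories/widgets
    ```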

    | Everett
    0

  • Hi Yiannis, As far as I'm aware, there isn't really a way to "block" a link. The link is seen on the other site; returning a 404 for the page being linked to doesn't change the fact that there are 100K links from one site pointing at your site. The only options I'm aware of are to 1) contact the owner of the website with the links and ask them to remove the links, and 2) if that doesn't work, disavow the links. I understand your hesitancy to use the disavow tool, but quite frankly, this is exactly what it is intended for. If you feel comfortable with the links being there and think Google has already dealt with them, then do nothing; but if you want to do something about the links, you either have to get them removed or disavow them. BTW, my understanding of partial manual actions is that oftentimes Google not only deals with the suspicious links (devaluing them), but also penalizes the pages/keywords they think you were attempting to manipulate. So, just because it was a partial action and not a full-site action doesn't mean it's not affecting some of your rankings; it's just not going to affect all your rankings for all your pages. Kurt Steinbrueck OurChurch.Com
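    For reference, the disavow file you upload in Webmaster Tools is a plain-text list: one URL or `domain:` entry per line, with `#` starting a comment. A minimal sketch (the domains are placeholders):

    ```text
    # Disavow every link from this site
    domain:spammysite.example.com
    # Disavow links from one specific page only
    http://anothersite.example.com/bad-links.html
    ```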

    | Kurt_Steinbrueck
    0

  • I've had to do this for our dev sites before - you'll be fine. Like you said, add the dev site in WMT, do a remove site by leaving the field blank, and proceed. It won't take your main site out of the index.

    | Kingof5
    0

  • I don't think they buy google ads for most of them (the spammers)

    | sbrault74
    0

  • You can also use things like an AWS S3 bucket or Rackspace Cloud Files as the hostname, making it easier to get your website closer to whatever country you are targeting. For what it's worth, I believe Akamai has over 200 points of presence around the world, which is quite a few, and they're available through Rackspace Cloud Files.

    | BlueprintMarketing
    0