Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Are robots.txt wildcards still valid? If so, what is the proper syntax for setting this up?
Great job. I just wanted to add this from Google Webmasters http://googlewebmastercentral.blogspot.com/2008/06/improving-on-robots-exclusion-protocol.html and this from Google Developers https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt
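To illustrate, here is a minimal sketch of the wildcard syntax the two links above describe. The `*` and `$` operators are extensions to the original robots exclusion protocol supported by Google and Bing, so other crawlers may ignore them; the paths are hypothetical:

```
User-agent: *
# "*" matches any sequence of characters:
# block any URL containing a query string
Disallow: /*?
# "$" anchors the end of the URL:
# block all .pdf files anywhere on the site
Disallow: /*.pdf$
# block everything under any folder named /private/
Disallow: /*/private/
```

Always test patterns like these in Google's robots.txt tester before deploying, since a broad wildcard can block far more than intended.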
| DarinPirkey0 -
No Google cached snapshot image... 'Text-only version' working.
I have the same issue. Only my text version of Google cache shows something for voyage
| vincentgagne1 -
Is this normal on my website speed tool
Hi Sri, what is the reason you think I should use text-only ads?
| ClaireH-1848860 -
Undo a 301 or Starting New Domain?
Fab, thanks for taking the time to reply that really helps going forward. Much appreciated.
| Nobody15609869897230 -
Should We Index These Category Pages?
If you can, implement a "snippet" mentality where you write unique content at the top of those categories, and then provide the links with a short excerpt under each link below (in the case of blog posts). This often helps eliminate problematic duplicate content issues. I would recommend at least 100 words of pure unique content above the blog posts on those category pages to start.
| toddmumford0 -
Duplicate Title Tags
Hi Richard, I'm really happy I could be of help to you. Let me know if there's anything else you need. Sincerely, Thomas
| BlueprintMarketing0 -
Excessive Links - How to fix
I checked your homepage: you have 680 links found on http://www.printe-z.com/. I used this and it's always worked very well: http://www.feedthebot.com/tools/linkcount/result.php?url=http%3A%2F%2Fwww.printe-z.com%2F. You need a web dev; try the guy I listed, he knows his stuff. This is the tool without your URL, for you to test: http://www.feedthebot.com/tools/linkcount/
| BlueprintMarketing0 -
Empty Google cached pages.
I guess that could be possible since there is a text only version that is cached. Has anyone else ever heard of this? Changed hosting and now speed isn't an issue anymore but the problem is still there.
| vincentgagne0 -
Block Domain in robots.txt
Hi Philipp, I have not heard of Google going rogue like this before; however, I have seen it with other search engines (Baidu). I would first verify that the robots.txt is configured correctly, and verify there are no links anywhere to the domain. The reason I mentioned this prior was due to this official notification from Google: https://support.google.com/webmasters/answer/156449?rd=1 "While Google won't crawl or index the content of pages blocked by robots.txt, we may still index the URLs if we find them on other pages on the web. As a result, the URL of the page and, potentially, other publicly available information such as anchor text in links to the site, or the title from the Open Directory Project (www.dmoz.org), can appear in Google search results." My next thought would be: did Google start crawling the site before the robots.txt blocked them from doing so? This may have caused Google to start the indexing process, which is not instantaneous, and then the new URLs appeared after the robots.txt went into effect. The solution is to add the noindex meta tag, or to put an explicit block on the server as I mention above. If you are worried about duplicate content issues, you may be able to at least canonical the subdomain URLs to the correct URL. Hope that helps and good luck
| donford0 -
Problem indexing web developed with Ruby on Rails
Hi Eduardo, For the titles, this is probably due to Google rewriting page titles based on brand searches. They have been experimenting with various ways of displaying titles in the SERPs for branded searches, and if you are searching for 'jobsandtalent' with no spaces then this is a pretty specific search and Google is rewriting your title based on it. If you search for your whole page title + brand you will see the normal title as expected. It does not have anything to do with Ruby on Rails. As for the PageRank, this is not a number I place much importance in. I can't remember offhand how often it is updated, but it is not all the time. More to the point, be looking at Moz domain and page metrics if you ask me. That being said, I see your PR as 5 for the root domain www.jobandtalent.com. I noticed you seem to be using cookie-based redirects from the main domain to the language folder, so that if you have entered /es once, then going to the .com main page automatically pushes you to .com/es. This can potentially be problematic in terms of Google properly indexing your site. I cannot say if this is responsible for your difficulties in rankings, but in a competitive sector like job postings I would certainly look at changing that, so that Google (and users) can view all pages of the site in whichever language they choose without being pushed into a language based on cookies. Hope that helps!
| LynnPatchett0 -
Duplicate content or titles
Hi there, Mike and Bradley have given you solid advice on how to correct your first issue. Would you mind sharing a few more details about the nature of your second issue by posting it as a new, separate question? Thanks so much!
| Christy-Correll0 -
Marketing URL
I don't understand the point of the external domain, either - I tend to agree with you that sub-folders would be perfectly fine here. Unfortunately, there are two issues at play: (1) You've got to direct the short, marketing URLs to the main site URLs. That's relatively easy - you could either use rewrites or redirects (rewrites are probably better for humans, in this case). Google can't generally index URLs that are on printed materials, TV ads, billboards, etc. (unless people start linking to them or promoting them on social), so your SEO risks are pretty minimal. The safest bet would be a straight 301-redirect. (2) The other issue is the long/ugly URLs on the site in general. If you want to move the entire site to the short URLs and then use those short URLs for marketing, that's great, but then you'll need to 301-redirect the entire current site. There are definitely risks to that approach, and I'm not sure it's necessary here. The benefits really depend on the scope of the current site and whether the URL structure is creating problems, like spinning out duplicate pages. You could also potentially use rel=canonical to solve some problems, but again, changing your site-wide URLs involves risk. I wrote more on (2) here: http://moz.com/blog/should-i-change-my-urls-for-seo
| Dr-Pete0 -
How a change in IP address affects SEO?
I have nothing new to add but wanted to say that both Chris and Adam's answers are spot on from my experience, thumbs up to both of them.
| donford0 -
Is there a tool to find a strange link
Cheers for that. It is strange: I am looking at links on the home page, of which there are around 140 or so, and then Google is pulling links from the different sections and showing them on the home page. Will see if I can find a solution to try and reduce the links. Many thanks for all your help
| ClaireH-1848860 -
No Local & Global Search Volumes Next to Competition%
Hi Fries, Search volumes are now available as we are using Bing's Adwords API. Unfortunately Belgium is not one of the countries supported by the API yet; however, Google Adwords can pull data for you. Here is the list of supported countries for Bing's API, hopefully they will be able to add more soon: Argentina, Austria, Canada, Chile, Colombia, Denmark, Finland, France, Germany, India, Ireland, Italy, Mexico, Netherlands, Norway, Peru, Singapore, Spain, Sweden, Switzerland, Venezuela, United Kingdom, United States.
| DavidLee0 -
403 forbidden error how to solve them
Hi Tim, Glad it helped. It might be worth asking your host what kind of features they have for preventing flooding attacks; there are various ways of addressing them on the server side that most hosts will have enabled in one way or another. Unless you have a specific issue with these kinds of attacks, it seems to me that this part of the module is causing more harm than good as it is now.
| LynnPatchett0 -
How to test a geo tagged homepage?
Unfortunately it seems that the e-commerce platform (which is closed source) is built around this geo IP location system. When I checked the cached version of the site it was indeed the American version, which was just the default simple version of the homepage (no keyword text). I then Google-searched strings of text from our UK/Irish homepages, and no results were found. So I then created an American version of the homepage (just a duplicate of the UK/Irish homepages). A week later I did my search test, and got a hit. Now we are starting to rank for a few more keywords on the homepage.
| PaddyDisplays0 -
Best Way to Break Down Paginated Content?
I think you are forgetting that engines are capable of running JavaScript just fine, and all the content that is brought in via AJAX to be viewable to the user will also be indexed by search engines. I would certainly go with option 4; it's a standard today. But have a look at pushState and address manipulation: that way, your users will be able to access an exact review (say on page 3) by just typing the address http://www.mysite.com/blue-widget-review-page3 and have page 3 loaded by default. If you go this route, you can also put a hidden (CSS'ed) NEXT PAGE button at the end to link to the next page. Hope that helps!
| FedeEinhorn0 -
Does "?" in my URL have a negative effect?
The question mark sign signals the beginning of URL parameters. This will not impact your keyword; however, if the URL parameters do not significantly alter the content of your page then this could cause duplicate content issues, as search engines will see each parameter variation as a unique URL. Search engines will try to identify these, but to help them out you should discount URL parameters in Webmaster Tools for the respective search engine (not just Google). And to avoid link equity being lost or diluted, implementing the rel=canonical tag on the page and referencing back to a sensible root page version will mean all your inbound links work together. Hope this helps.
| Sarbs0