Questions
Canonical URL availability
Thanks Dirk for your great in-depth response! I will now check with the developers what the estimated effort would be. Making the canonical URL available will let me sleep better at night before releasing the new site version. I think the risk shouldn't be huge if we cannot do this, so we won't waste too many resources on it (unless, of course, we see a negative impact, which I will then report here ;) Best, Phil
Intermediate & Advanced SEO | zeepartner
Robots.txt on http vs. https
Glad to be of help. Check out this Google link to confirm you're covered for the 180-day crawl window: https://support.google.com/webmasters/answer/83106?hl=en The second URL is helpful as well: http://blog.raventools.com/moving-site-from-http-to-ssl/ All the best, Tom
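For anyone making the same move, the HTTP-to-HTTPS redirect the second article covers is usually handled server-side. A minimal sketch, assuming an Apache server with mod_rewrite (not taken from the article itself):

```apache
# Hypothetical .htaccess rules: 301-redirect every http:// request
# to its https:// equivalent so only one protocol gets crawled
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

With the redirect in place, search engines fetch robots.txt on the https host separately, so make sure it's served there too.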
Technical SEO Issues | BlueprintMarketing
Google indexing despite robots.txt block
It sounds like Martijn solved your problem, but I still wanted to add that robots.txt exclusions keep search bots from reading the pages that are disallowed, but they do not stop those pages from being returned in search results. When those pages do appear, they'll often have a page description along the lines of "A description of this page is not available due to this site's robots.txt". If you want to ensure that pages are kept out of search engine results, you have to use the noindex meta tag on each page.
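To make the distinction concrete, here is a minimal sketch of both mechanisms (with /private/ as a placeholder path):

```
# robots.txt: stops crawling, but a disallowed URL can still be
# indexed (description-less) if other sites link to it
User-agent: *
Disallow: /private/
```

```html
<!-- noindex meta tag, placed in the <head> of each page you want
     kept out of search results -->
<meta name="robots" content="noindex">
```

One catch: crawlers can only see the noindex tag if they are allowed to fetch the page, so a URL that is both disallowed in robots.txt and tagged noindex may still linger in the index.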
Technical SEO Issues | john4math
Block Domain in robots.txt
Hi Philipp, I have not heard of Google going rogue like this before; however, I have seen it with other search engines (Baidu). I would first verify that the robots.txt is configured correctly, and verify there are no links anywhere to the domain. The reason I mentioned this earlier was this official notice from Google: https://support.google.com/webmasters/answer/156449?rd=1 "While Google won't crawl or index the content of pages blocked by robots.txt, we may still index the URLs if we find them on other pages on the web. As a result, the URL of the page and, potentially, other publicly available information such as anchor text in links to the site, or the title from the Open Directory Project (www.dmoz.org), can appear in Google search results." My next thought would be: did Google start crawling the site before the robots.txt blocked it from doing so? This may have caused Google to start the indexing process, which is not instantaneous, and then the new URLs appeared after the robots.txt went into effect. The solution is to add the noindex meta tag, or to put an explicit block on the server as I mention above (see the sketch below). If you are worried about duplicate content issues, you may at least be able to canonical the subdomain URLs to the correct URL. Hope that helps, and good luck!
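For the server-level block, one common approach (named here explicitly, since the earlier post being referenced isn't shown in this thread) is the X-Robots-Tag response header. A sketch for Apache, assuming mod_headers and a placeholder subdomain:

```apache
# Hypothetical vhost snippet: tell search engines not to index
# anything served from this subdomain (requires mod_headers)
<VirtualHost *:80>
    ServerName dev.example.com
    Header set X-Robots-Tag "noindex, nofollow"
</VirtualHost>
```

Unlike robots.txt, this header travels with every fetched URL, so Google can actually act on it. For the duplicate-content angle, a <link rel="canonical"> on each subdomain page pointing at its main-domain counterpart does the consolidation.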
Technical SEO Issues | donford
405 HTTP Status instead of 404
Hmm, this might actually be a non-issue. I used http://tools.seobook.com/server-header-checker/, but when I check it with Firebug, it correctly returns a 404...
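One way to cross-check without a third-party tool is curl (a command-line sketch; the URL is a placeholder). Some header checkers send HEAD requests, and a server that refuses HEAD answers 405 Method Not Allowed even though a normal GET returns 404, which would explain the discrepancy:

```sh
# HEAD request (what many checkers send): may yield 405
curl -s -o /dev/null -w "%{http_code}\n" -I http://www.example.com/no-such-page
# Normal GET (what browsers and Googlebot send): should yield 404
curl -s -o /dev/null -w "%{http_code}\n" http://www.example.com/no-such-page
```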
Technical SEO Issues | zeepartner
AJAX & jQuery Tabs: Indexation & Navigation
Hello Philipp, the robots.txt file allows you to tell the web bots that crawl your site which URLs you do and don't want shown to the world and to Google. Most search engines will analyze and follow a link only if it contains three query string parameters or fewer. The link you posted has five parameters (everything after the first /), as shown below:

/en/residential/help/loesung/entfernen-sie-sim-lock
http://www.swisscom.ch/en/residential/help/loesung/entfernen-sie-sim-lock.htm

You can block off certain parameters with robots.txt (see the sketch below). For some reason, whenever I go to the link you posted, I get this error: "The requested URL /system/sling/cqform/defaultlogin.html was not found on this server." See these references on URL parameters:

http://msdn.microsoft.com/en-us/library/ff723936(v=expression.40).aspx
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1235687

Microsoft and Google agree that with too many parameters, they will not crawl the link. I hope this has been helpful. Sincerely, Thomas
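If parameter-heavy URLs do need to be kept out of the crawl, wildcard rules in robots.txt are the usual tool. A generic sketch (the parameter name is a placeholder, not one from the site in question):

```
# Major engines support the * wildcard in robots.txt rules
User-agent: *
# Block any URL that carries a query string
Disallow: /*?
# Or target one troublesome parameter specifically
Disallow: /*sessionid=
```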
Web Design | BlueprintMarketing
Escape commas in OSE CSV export
Phillip, thanks for writing in! Just so I can see the problem you're describing, could you let me know which report you downloaded and where you're seeing the issue? With that, I can try to replicate it. Looking forward to hearing from you. Peter, SEOmoz Help Team
Moz Tools | Peterli
HTTP & HTTPS canonicalization issues
Cool, thanks for your backup on this. I figured that the bot-redirect described would be a bit over the top. And it's great to know that having an http homepage on an otherwise https domain is a non-issue. Much appreciated!
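For anyone landing on this thread later: the low-effort safety net for http/https duplicates is a canonical tag served identically on both protocol versions of a page (a generic sketch; example.com is a placeholder):

```html
<!-- Whichever protocol the page is reached on, search engines are
     told to index this one version -->
<link rel="canonical" href="https://www.example.com/">
```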
Technical SEO Issues | zeepartner
View all Page for Product Overview Pages
Hi Ben, thanks for your thoughts. It makes sense to use rel=next/previous. I'm just not certain whether I should also indicate the view-all page. We do have one, but I think it's a better user experience to display just the first 30 products (since that page loads faster).
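For reference, the pagination markup being discussed sits in the <head> of each page in the series; a sketch for a hypothetical page 2 (placeholder URLs):

```html
<!-- Page 1 omits rel="prev"; the last page omits rel="next" -->
<link rel="prev" href="http://www.example.com/products?page=1">
<link rel="next" href="http://www.example.com/products?page=3">
<!-- Optionally, every paginated page can also point at a view-all
     version, if you decide to indicate one -->
<link rel="canonical" href="http://www.example.com/products/view-all">
```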
On-Page / Site Optimization | zeepartner
Event Landing Pages not ranking
Hi Cyrus, thanks for your thoughts on this. You've actually confirmed my own suspicions... Indeed, what we decided to do next is to get rid of the AJAX loads (since AJAX indexation would give our devs too much of a headache...), and we'll optimize the links to the event landing pages themselves by putting an href around each event's title as well. I'll give a heads-up when the changes have been made and as soon as I see any effect, be it positive or negative...
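The href change described would look roughly like this (a hypothetical before/after; the markup, class names, and URL are invented for illustration):

```html
<!-- Before: the title is only clickable via JavaScript, so crawlers
     have no path to the landing page -->
<h3 class="event-title" onclick="loadEvent(42)">Jazz Night</h3>

<!-- After: a plain href gives crawlers a followable link; JS can
     still intercept the click for users if needed -->
<h3 class="event-title"><a href="/events/jazz-night">Jazz Night</a></h3>
```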
Technical SEO Issues | zeepartner
Google Places: Multiple Entries
Use Google Map Maker to see what Google has already indexed as far as the business name, etc. You should only have one listing. If you have multiple addresses, then you can have more than one listing, but you must have a unique phone number for each listing.
Vertical SEO: Video, Image, Local | StrategicEdgePartners
Hidden Content with "clip"
...although I should add that it is indeed normal to give visually impaired users a way to skip content (an accessibility organisation confirmed this). I'm just not sure what the best practice for doing it is, and I seriously don't like reducing content to 1 pixel.
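For what it's worth, the widely cited accessible-hiding pattern does use clip (a generic sketch, not tailored to any particular site):

```css
/* Keeps the text available to screen readers while hiding it
   visually; display:none would hide it from screen readers too */
.visually-hidden {
    position: absolute;
    width: 1px;
    height: 1px;
    padding: 0;
    margin: -1px;
    overflow: hidden;
    clip: rect(0, 0, 0, 0);
    border: 0;
}
```

It admittedly still relies on a 1px box under the hood, but combined with clip and overflow:hidden it's the pattern accessibility guides generally point to for skip links, rather than a hand-rolled 1-pixel trick.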
Intermediate & Advanced SEO | zeepartner