Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Google Webmaster Tools doesn't allow me to send 'URL and all linked pages'
ndirseo, since you "made a lot of optimization changes" I am going to assume you used Fetch as Google a lot as well. If so, you will only have the option to Submit URL and all linked pages 10 times per month. You can Fetch as Google 500 times. I think the issue here is simply one of bandwidth and overuse.

Think of it this way: Fetch as Google was designed to let you see pages as Google sees them, for the purpose of finding errors, checking sites that had been spammed, etc. As more and more people began using it, Google had to limit it due to overuse. If you have checked a page with Fetch as Google and it comes back OK, and you do that for the whole site, you should not have to use "URL and all linked pages" that often, as the pages are linked.

Hope this helps. I have pointed to where you can see the limits in GWMT in the attached image. Here is the link re webmaster tools. Note its next-to-last paragraph. Hope that helps, Robert
| RobertFisher0 -
Writing URL query strings to be SEO friendly
It's not too hard to do in .NET. Since .NET 3.5, routing has been added to Web Forms and is native to MVC. Basically, this will allow you to "rewrite", and thus create friendly URLs such as www.123.com/florida/miami/city-detail/. You can map your URL so your pages understand which section is city, state, etc., so you can query your database. Here is an article about routing in .NET: http://weblogs.asp.net/scottgu/archive/2009/10/13/url-routing-with-asp-net-4-web-forms-vs-2010-and-net-4-0-series.aspx Note: keep the URL in lower case and replace spaces with a dash, like this (Cape Cod > cape-cod). In our case, this page actually uses routing. I left the .aspx because I don't care much about it, but this way our entire catalog is written in 2 pages and it pulls the category and subcategory from the URL itself: http://www.smartresolution.com/printing/envelopes/10-envelopes.aspx Here, envelopes = category and 10-envelopes = subcategory. The .aspx can be dropped too if you are picky. All you have to do in your application is find a pattern, and you should be able to handle the rewrite in no time and with little code. Note: don't forget to redirect (301) the old pages to the new URLs as well.
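The answer above is about ASP.NET, but the slug convention it describes (lowercase, spaces replaced with dashes) is language-agnostic. Here is a minimal Python sketch of that logic; the segment names are just examples, not part of any real site:

```python
import re

def slugify(text):
    """Lowercase the text and collapse runs of spaces/punctuation into single dashes."""
    text = text.lower()
    text = re.sub(r"[^a-z0-9]+", "-", text)
    return text.strip("-")

def build_url(*segments):
    """Join slugified segments into a friendly path, e.g. /florida/miami/city-detail/."""
    return "/" + "/".join(slugify(s) for s in segments) + "/"

print(build_url("Florida", "Miami", "City Detail"))  # /florida/miami/city-detail/
print(slugify("Cape Cod"))  # cape-cod
```

The same normalization should be applied when generating links and when parsing incoming URLs, so both sides agree on the slug.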
| smarties9540 -
What happens to a page if I take it down then later put it back up?
I agree with the 302 solution. Makes complete sense for usability as well as for SEO. You could also 302 it to another, more appropriate page during the time it would be down and then remove the redirect when you want it make it available again.
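A minimal sketch of the temporary-redirect idea, written as a Python WSGI app. The paths and the holding-page URL here are hypothetical, purely to show the mechanics of returning a 302 while a page is down:

```python
def maintenance_redirect(environ, start_response):
    """Minimal WSGI app: 302 temporarily-removed pages to a holding page.

    down_paths and the /temporary-home/ target are hypothetical examples.
    """
    down_paths = {"/old-page/"}  # pages temporarily taken down
    path = environ.get("PATH_INFO", "/")
    if path in down_paths:
        # 302 signals "temporarily moved"; search engines keep the original URL indexed.
        start_response("302 Found", [("Location", "/temporary-home/")])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"page content"]
```

When the page comes back, remove it from `down_paths` and it serves 200 again, with no 301 having told search engines the move was permanent.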
| NakulGoyal0 -
SEO and SSL error (Error code: sec_error_revoked_certificate)
So the answer is definitely: the certificate needs to be fixed as soon as possible, because every day it is not, nobody can get to the site. That will then cause the SEO to fall apart. The longer it's down, the longer it will take for search engines to reset once they devalue the site.
| AlanBleiweiss0 -
Schema / microdata under Google listing
Google recommends: http://schema.org/docs/schemas.html
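As an illustration of the schema.org vocabulary linked above, here is a Python sketch that builds a minimal Product markup as JSON-LD, one of the formats schema.org supports alongside microdata. The product data is entirely invented:

```python
import json

# Hypothetical product data using the schema.org/Product and Offer types.
product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "10 Envelopes",
    "offers": {
        "@type": "Offer",
        "price": "9.99",
        "priceCurrency": "USD",
    },
}

# Wrap the JSON-LD in the script tag that goes in the page's HTML.
snippet = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(
    product_jsonld, indent=2
)
print(snippet)
```

The equivalent microdata approach annotates existing HTML elements with `itemscope`/`itemprop` attributes instead of a separate script block; the vocabulary (Product, Offer, price) is the same either way.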
| mahakian0 -
Canonicalization Issue | E-commerce
I am considering noindexing them because they are webpages that only show the price of that particular combination of products. I cannot change the e-commerce platform because it would require a heavy investment in both time and capital. I will try to write some content in order to differentiate them enough, although I don't know exactly how much content is considered duplicate. Do you know of a guideline on this topic? Regards, thanks!
| JesusD0 -
Page feedback
Only thing I think really needs adding would be image meta. Those pictures look blank to me when browsing as Googlebot, so the real Googlebot probably has no idea what's there.
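To act on the image-meta point, a quick way to audit a page is to flag `<img>` tags with no usable alt text. A small Python sketch using only the standard library; the sample HTML and filenames are made up:

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collect the src of every <img> tag lacking a non-empty alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # Treat a missing alt and an empty alt="" the same way here.
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "(no src)"))

html = '<img src="a.jpg" alt="Blue widget"><img src="b.jpg"><img src="c.jpg" alt="">'
checker = MissingAltChecker()
checker.feed(html)
print(checker.missing)  # ['b.jpg', 'c.jpg']
```

Note that an empty `alt=""` is legitimate for purely decorative images; for content images like the ones discussed above, descriptive alt text is what gives crawlers something to index.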
| MikeRoberts0 -
Can hotlinking images from multiple sites be bad for SEO?
Sorry, hotlinking was the wrong word to use, we're actually just embedding the images. Is it possible that Google recognises that spammy sites (as an example) tend to embed lots of images and therefore use it as an indicator of spam? Also, is poor netiquette ever taken into account? Again, maybe because Google is trying to find spammy sites? For the record, it is something we'll be fixing (especially from a copyright point of view), but we're trying to prioritise this. If there's a potential SEO impact, we'll sort it quick, if not, then we'll do more pressing things first.
| OptiBacUK0 -
International SEO - Canada
Hi there! Ideally, if you're going to be targeting another country, the best way to go is with a ccTLD, in this case .ca. If you're not able to, you can go with a directory and, at some point, switch to the ccTLD with a migration, when you're ready and it makes sense for your business. You can take a look at the different criteria to take into consideration in a post I wrote some weeks ago, where I also gave my recommendations from a web structure perspective for targeting many countries, with a different language for each, on a generic domain. In this case, to maintain consistency from a country/language perspective and keep the URL structure as simple as possible from a user and crawling perspective, it would be the following: USA: yourbrand.com, Canada in English: yourbrand.com/en-ca/, Canada in French: yourbrand.com/fr-ca/. In the post I mentioned you will also find some examples of these different structures and other best practices to follow. Good luck with the site! Aleyda
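Each country/language version in a structure like the one above is normally cross-annotated with hreflang tags so Google serves the right version to each audience. A Python sketch that generates those annotations, using the hypothetical yourbrand.com URLs from the answer:

```python
# Hypothetical locale-to-URL mapping following the structure in the answer.
locales = {
    "en-us": "https://yourbrand.com/",
    "en-ca": "https://yourbrand.com/en-ca/",
    "fr-ca": "https://yourbrand.com/fr-ca/",
}

def hreflang_tags(locales, x_default="en-us"):
    """Emit the <link rel="alternate"> set every page version should carry.

    Each version lists all versions (including itself) plus an x-default fallback.
    """
    tags = [
        '<link rel="alternate" hreflang="%s" href="%s" />' % (code, url)
        for code, url in sorted(locales.items())
    ]
    tags.append(
        '<link rel="alternate" hreflang="x-default" href="%s" />' % locales[x_default]
    )
    return "\n".join(tags)

print(hreflang_tags(locales))
```

The key rule is reciprocity: the identical tag set must appear on the US, en-CA, and fr-CA versions of each page, or the annotations are ignored.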
| Aleyda0 -
Finding Broken Back Links
I would second the Screaming Frog tool; just used it yesterday and it's great! Might even buy a licence for it now. The free version's great though.
| JonathanRolande0 -
Meta Description: How to Implement It?
All In One SEO Pack is another good free plugin that allows you to directly edit the meta description tag on each page
| madegood1 -
Sudden drop in website ranking
Takeshi is right that newer websites can fluctuate at the beginning. You need to make sure that you are using the right type of link building techniques. Go for quality over quantity: find relevant websites that match your industry (technology, in your case) and strike up a relationship with them. Go for branded links, and keep an eye on the percentage of brand links versus anchor text links; too many anchor text links and Google will penalise you.
| KarlBantleman0 -
Free Joomla SEO Extension?
I have Joomla and K2 installed on 5 sites. The best solution I've found, and it isn't free, is mijoSEF (http://extensions.joomla.org/extensions/site-management/sef/22026). Using K2 will require two of their extensions, which is a pain, but I've found it worth the investment. Their tutorials are well done and complete, something rare for extensions in my opinion. I'm not an expert and haven't used its full functionality, but it's saved me, and my team, a lot of time with meta tags, 404s, and duplicate content. Best, Doug
| DougHoltOnline0 -
URLs blocked by robots.txt
I added the 2 lines several hours ago because I saw that Google had crawled some Zend routes. I changed the entire content of the website 2 weeks ago, and during this process I noticed the problem. Many thanks, Mark, for your help.
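Before (or after) deploying rules like these, you can verify exactly which URLs they block with the Python standard library's robots.txt parser. The Disallow lines below are hypothetical Zend-style examples, not the poster's actual 2 lines:

```python
from urllib import robotparser

# Hypothetical robots.txt content blocking framework-generated routes.
robots_txt = """\
User-agent: *
Disallow: /zend/
Disallow: /index.php/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check specific URLs against the rules, as a given user agent would see them.
print(rp.can_fetch("Googlebot", "https://example.com/zend/route"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/products/"))   # True
```

This catches over-broad patterns before Googlebot does; a rule that accidentally matches real content pages will show up immediately as an unexpected `False`.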
| meralucian370 -
Importance of correction of technical errors
Thanks for the excellent response! Exactly what I wanted to hear! Regards!
| JesusD0