Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Having a massive amount of duplicate crawl errors
There isn't necessarily anything wrong with your login; it looks like it's carrying a referring parameter in the URL so that it knows where to send you next. Not the most sophisticated way to do it, but very common. Since it returns a 404, the easiest solution would be to make your login link/button "nofollow". You could add this to the HTML or JavaScript fairly easily. Or you could place a directive in your robots.txt file disallowing these URLs. Something like: User-agent: * Disallow: *login should do the trick (but test this). If JavaScript code is causing this, you might try wrapping the script in CDATA tags, which might also do the trick. Hope this helps. Best of luck with your SEO!
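The robots.txt directive suggested above, written out as a sketch (the exact pattern depends on what your login URLs actually look like):

```
User-agent: *
Disallow: /*login
```

Note that the `*` wildcard is understood by Google and Bing but is not part of the original robots.txt standard, so verify the pattern with the robots.txt tester in Google Webmaster Tools before relying on it.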
| Cyrus-Shepard0 -
Pages not indexed by Google
Good advice from Andrea and Brent. To use multiple sitemaps, do something like this: the main sitemap index points to the other sitemap files, and each of those files can hold up to 50,000 URLs (mine are gzipped). This one is sitemap_index.xml:

<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://yourdomain.com/writermap.xml.gz</loc><lastmod>2012-03-15</lastmod></sitemap>
  <sitemap><loc>http://yourdomain.com/mainmap.xml.gz</loc><lastmod>2012-03-15</lastmod></sitemap>
  <sitemap><loc>http://yourdomain.com/201201.xml.gz</loc><lastmod>2012-03-15</lastmod></sitemap>
  <sitemap><loc>http://yourdomain.com/201202.xml.gz</loc><lastmod>2012-03-15</lastmod></sitemap>
</sitemapindex>

Here is a tip: Google will index some of those pages and some it will not. If you have 5,000 URLs in one sitemap and Google indexes only 4,957, you probably can't work out which 43 URLs it skipped. If you make the numbers smaller, it becomes easier to discover the pages Google doesn't like. Not easy, but easier.
| loopyal0 -
Should I add my brand name to every page title
Ditto those who say use it. In addition, if you're a local retailer with some name recognition, you'll benefit from having your name back-branded in the title. Remember world famous brands were not always world famous.
| AWCthreads0 -
Grabbing Expired Domains
I buy most of my expired domains through GoDaddy Auctions. Once you win an auction, it typically takes 4-5 days before the domain name is transferred to your account. Also, I use www.registercompass.com to evaluate expired domain names.
| StreamlineMetrics0 -
How to handle a future company expansion?
Hi Net66, Thanks for the further information. So - like an SEO or design firm, your client will basically have to make efforts to get as far as she can with organic. If she were actually an SEO, Google doesn't show those in Local anyway; but if she is competing with local businesses that have physical addresses, no matter what she does she is unlikely to outrank them. Hopefully the work you do with her will enable her to come up just below them. Half a loaf is better than none! Good luck. Miriam
| MiriamEllis0 -
Duplicate Home Page content and title ... Fix with a 301?
Just a side note - your home-page TITLE is "Peruvian Soul | Peruvian Soul" - not that it's likely to look spammy to Google, but it just looks odd and isn't really helping anything. Might be an artifact of your CMS.
| Dr-Pete0 -
What can be the cause of my inner pages ranking higher than my home page?
In Google Webmasters I currently don't have any messages showing from Google
| deciph220 -
Replacing H1's with images
Hey, My advice would be to use an image tag if you really want to use this technique, and put alt text on the image. The alt text will then serve as the H1 for search engines, and the image will serve your visitors. I hope that helped, Istvan
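A minimal sketch of that technique (the filename, alt text, and heading copy are hypothetical), keeping the heading element for search engines while showing the image to visitors:

```html
<!-- The image carries the visual branding; the alt text stands in for the heading copy -->
<h1>
  <img src="/images/logo-headline.png" alt="Acme Letterhead Printing">
</h1>
```

Keeping the image inside the H1 with descriptive alt text means crawlers still see heading-level text for the page, rather than an empty heading.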
| Keszi0 -
Should we introduce subfolders into the URLs on a new site?
(PS. We are obviously going to thoroughly 301 the old URLs to the new ones so we don't lose any link juice.)
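For those 301s, a minimal Apache sketch (the old and new paths are hypothetical, and this assumes the site runs on Apache with .htaccess enabled):

```apache
# Permanently redirect an old flat URL to its new subfolder location
Redirect 301 /letterhead-printing /products/letterhead-printing
```

One `Redirect 301` line per moved page keeps the mapping explicit; for large sites, a `RewriteRule` with a pattern can cover a whole folder move in one directive.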
| oneresult0 -
How easy is it to quantify the negative value of additional (possibly unnecessary) folders and subfolders in a URL?
How easy is it to quantify... This cannot be quantified - not by anyone outside of Google, and probably not by anybody inside without a lot of assumptions and math. I am a math guy, but long ago I gave up on SEO quantification. Not worth the time. Just do your best.

Working on the principle that shorter is better I would like to see the URL looking more like this: domain.com/letterhead_printing... Yes, get those generic, undescriptive words out of there. Make the URL short enough that it will be easily read in the SERPs. Don't clutter it with BS. And use hyphens instead of underscores: domain.com/letterhead-printing/

Can anyone tell me how big a difference that would actually make? Nobody can tell you anything more than a guess. My guess is: very little for ranking. It could bring a few more clicks in the SERPs, but just a few. You have the right idea for doing this properly. The benefits are tiny (especially in hard competition), but tiny here and tiny there adds up to something that might move you one position higher in a close fight.
| EGOL0 -
Geo-Redirection
In GWT, go to Site configuration > Settings and geo-target www.mysite.com.au to AU. As for www.mysite.com, leave it "Unlisted" under Geographic target. This will make the AU site be seen only in AU while the .com shows worldwide.
| Francisco_Meza0 -
Is it a bad that my site has the same title and description for directory listings?
Thanks! I'll try and have them changed / removed. I'm thinking that's why Google is ranking me much worse all of a sudden. My site is 14 months old, and the directories do make up the bulk of my backlinks. Do you know if this is a huge problem?
| eugenecomputergeeks0 -
How to block "print" pages from indexing
Donnie, I agree. However, we had the same problem on a website, and here's what we did with the canonical tag: we added it to each print page, pointing at the web version. Over a period of 3-4 weeks, all those print pages disappeared from the SERPs. Now if I take a print URL and do a cache: search for that page, it shows me the web version of that page. So yes, I agree the question was about blocking the pages from getting indexed. There's no real recipe here; it's about getting the right solution. Before the canonical tag, robots.txt was the only solution. But now that canonical is there (provided one has the time and resources available to implement it vs. adding one line of text to robots.txt), you can technically canonicalize the pages and not have to stop/restrict the spiders from crawling them. Absolutely no offence to your solution in any way - both are indeed workable solutions. The best part is that your robots.txt solution takes 30 seconds to implement, since you provided the actual disallow code :), so it's better.
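The canonical approach described above, as a sketch (the URLs are hypothetical): each print page carries a link element in its <head> pointing at the web version, so Google consolidates indexing onto that version.

```html
<!-- Placed in the <head> of the print page, e.g. /print/how-to-block-print-pages -->
<link rel="canonical" href="http://example.com/article/how-to-block-print-pages" />
```

Unlike a robots.txt disallow, the print pages remain crawlable; Google simply attributes them to the canonical URL instead of indexing them separately.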
| NakulGoyal0 -
Order of keyword usage in URL
Thanks Ryan. That helped a lot. I will not change the current page URLs, but I will try to use exact-match keywords in URLs in the future. best,
| Gamer070 -
Issues with Google Analytics since 3/15 @ 6:00AM ET
UPDATE: Statistics from 3/15 have miraculously updated. Still waiting for today to catch up, but we are more optimistic now. Joe
| Irishcentral1