The best free one I've found that uses PayPal is Easy PayPal Custom Fields.
Really easy to use and implement, and no hassles when handing over to clients who don't have too much web expertise and need something really easy to use.
That's correct in most cases:
It works like this: a robot wants to visit a website URL, say http://www.example.com/welcome.html. Before it does so, it first checks for http://www.example.com/robots.txt, and finds:
User-agent: *
Disallow: /
The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site.
Robots can ignore your /robots.txt, though. In particular, malware robots that scan the web for security vulnerabilities, and email address harvesters used by spammers, will pay no attention to it.
More information is available here:
Hey there,
I had to deal with something similar for AU and NZ in the past as well. The difference was that we shared a .com URL and used ?site=au and ?site=nz query strings. Nightmare! AU (for reasons similar to the ones you're looking at overcoming) was cannibalising the NZ results. We then changed it all to separate .com.au and .co.nz domains, which ultimately turned them into two sites independent of each other.
This is kind of what you'll be doing. They will be seen as two individual sites which need individual attention to gain search ground. The problem we faced was that our AU market was larger than NZ and, as such, got more 'internet love' (backlinks). AU also had the benefit of having been around for a long time, so its domain authority was a lot higher than the newly formed NZ URL's.
You could interlink between the two and give the new NZ site a bit of link juice from the established AU kudos, but it sounds like you're gonna have to do a bit of legwork to get the new site the kind of attention the older, established AU site's been getting.
Most importantly, ensure that the old NZ domain is 301'd and canonical-tagged to the new site. Update and add the new sitemaps. Perhaps do media releases and blog/forum posts with the new URL to promote it and get that URL out into the world.
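For what it's worth, if the old NZ site runs on Apache, the site-wide 301 can be done in .htaccess with something like this (domain names here are just placeholders):
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-nz-site\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-nz-site.co.nz/$1 [R=301,L]
This redirects every old URL to its equivalent path on the new domain, so existing links and bookmarks carry across.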
The other aspect is that it's a localised site (AU and NZ), so your best bet for NZ results is to get NZ-centric links. The last thing you want is for your NZ and AU results to get mixed up in the search engines due to an overwhelming amount of AU backlinks pointing to an NZ site. Sorry to be the bearer of kinda bad news, but if my experience (albeit early on) is anything to go by, you've got to put in the hard yards to get the new NZ site some links and build its NZ kudos.
Good luck mate.
Hey all,
I understand the functionality of PDF files being indexed and how to remove them if required, so in this post I'm not after any 'how to' advice as such; I just wanted to get a general opinion/consensus on whether you deliberately allow PDF files to be crawled/indexed.
Also, whether or not you guys optimise the files for search.
If you do disallow them from being crawled and indexed, why?
And generally, the pros and cons you may have found of having searchable PDF files as part of your indexed content.
I guess you gotta ask if there's any benefit to that page/content being crawled or indexed.
The view cart pages, I would've thought, would be completely dynamic, built around individual user interaction, and have little (if any) relevant content that would benefit your site. I'd guess that blocking this page is pretty standard for shopping carts.
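If you did want to block it yourself, the robots.txt entry is usually as simple as the below (the /cart/ path is just a guess; check what your platform actually uses):
User-agent: *
Disallow: /cart/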
Not too sure if this helps, but I found this site that put together a handy little list of recommended excludes for a few popular CMSs.
All search engines technically 'read' titles longer than 70 characters but will only display 70 or fewer; anything longer is cut off with an ellipsis (...).
As the title is the first thing users will see, having content beyond that is of no benefit (if you can't read it, what's the point?).
There's a good write-up about it here:
http://www.seomoz.org/learn-seo/title-tag
(If there is a search engine that does display more than 70, it's likely not much of your audience is using it.)
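So something like this made-up example is what you're aiming for:
<title>Blue Widgets for Sale | Acme Widget Co.</title>
At around 40 characters it displays in full, with plenty of room before the 70-character cut-off.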
A subdomain site is treated as a different website from your root domain, but having a large amount of duplicate content (being a forum, I'm assuming it does) can't be great. Is there any reason to keep the two operating at the same time (rather than just 301'ing it)?
The old site would still be getting crawled and indexed, so from an SEO perspective I guess that's a bad factor for your root domain, as it'd be a direct competitor against your proper domain content.
Your best bet would be to canonical the content on the old domain/posts and/or 301 the old stuff to the new stuff as soon as possible, from both a user point of view (two sites is confusing) and a duplicate content point of view. I guess the main issue is that you now have yourself as a competitor for content that should be getting found on your main domain.
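The canonical tag itself is just a line in the <head> of each old page, pointing at the new home of that content (the URL here is a placeholder):
<link rel="canonical" href="http://www.example.com/new-post-url" />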
PHP includes are server-side, so from a visitor and search engine perspective the page will render fine and be indexed as if it were straight-up HTML.
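A quick sketch of what I mean (file names here are hypothetical):
<?php
// Runs on the server before anything is sent, so crawlers and
// visitors both receive the finished HTML, not the PHP.
include 'header.php';
?>
<p>Page content here.</p>
<?php include 'footer.php'; ?>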
No, not too large; it was the third most popular page behind the homepage. Whole page size is around 3 MB.
As John mentioned, page rank will be minimal, and new domains won't have much domain rank either (likely a lot less than your established main domain) if you're wanting them for organic search purposes.
Unless you're wanting to keep building upon the domains and establish them as more than 'one-off' landing pages (link building etc.), using them over your main domain won't do you much good, short term or long.
I noticed this a little while ago too; it lasted for a little over a week but eventually showed up again. I looked around for feedback to see why and came across a couple of instances (albeit a little more elaborate than what I was experiencing), but this post talks about it a little:
http://www.ghacks.net/2011/05/07/google-search-new-layout-style/
It's possible you or your results might be part of a test case for a new Google release. Were you logged in? Maybe try from a different IP and see if the results are the same...
Dunno if that helped at all, but it's just something that might put your mind at rest a little.
Not entirely sure how this system works, but I would think that if the express-lane content had search/content benefits, then adding it as a folder would be the better option.
As subdomains are treated as separate websites, if the content can benefit your root/main domain, you're better off giving the kudos to that rather than to a standalone subdomain site.
Great. Thanks guys. I guess it was more a question of 'do you think there is a limit to outbound links?'
EG: If you're a computer company, then getting a link to your site from apple.com would be gold. But if every computer website in North America got a link from the Apple site, would that devalue either a) the link's value or b) Apple's domain value?
I know that's an extreme example, but I thought it might shed some light on the 'how many links is too many' question.
Thanks again.
As Steven mentioned, you gotta be careful when doing this. As there is often a very long 'leash' as far as content on forums goes, I've heard of instances where adding a forum to the main domain has harmed popular SERPs for keywords and domain authority, especially since Panda.
Perhaps check your numbers for the current forum first. See if you can spot anything that may be harming (or helping) the current forum site, and fix any issues before you migrate it in.
There was a conversation about this earlier in the week that might help you out a little more:
Good luck
So I was wondering if there is a general consensus regarding the number of outbound links going from your domain to other domains, and whether there is a correlation: the more you have going out, the more (for want of a better word) damage you can do to your domain.
EG: is a site with good domain authority better off keeping a short leash on handing out links to external sites? Does this make the link juice from the site more valuable?
As Ryan said, rel="author" would help.
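In markup terms that's just an attribute on a link to your author page (the path here is hypothetical):
<a rel="author" href="http://www.example.com/about/author">About the author</a>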
Another thing which might help is an app called Tynt. It adds a little snippet of code so that whenever your content is copied from your site, it attributes it to your domain by attaching a link when that copy is pasted. It's not foolproof, of course, and the URL can be removed, but I like to think of it as one more safeguard for attributing content to your site (plus it's good for link building).
One bit of advice I found useful: if you do link to other content on your site in your posts, use absolute paths to your own domain. That way, if your content is scraped, you at least get a direct link back to your site, plus added kudos for being relevant to the topic, even if the post is picked up by someone else.
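In other words, in your post HTML write the full URL (example.com standing in for your own domain):
<a href="http://www.example.com/my-original-post">my original post</a>
rather than a relative href like "/my-original-post", which ends up pointing at the scraper's domain when the content is lifted.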
Of course, none of this is foolproof, but these are a couple of added measures that might help your site benefit a little if your content is stolen.
Wouldn't a 302 be a better option? Temporary vs permanent.
Odd...
I thought it might have something to do with spamming, as these are heavily spammed terms (the same results don't show up for Viagra either). But then I found this post, in which a Google employee is asking for examples. Sounds like a problem more than a 'feature'.
http://www.google.com/support/forum/p/AdWords/thread?tid=4930bd33634b1755&hl=en
Recently had the same issue, and we went with not indexing them. Two issues mainly: the duplicate content issue (the same posts being attributed to author, post, and category archives etc.), which I don't believe is too much of a problem, and the fact that the general titles attributed to the results were pretty average.
We tested both indexing and excluding them, didn't see any increase in traffic either way, and so went with the exclude option.
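For reference, the standard way to do the exclude is a robots meta tag in the <head> of those archive pages (most SEO plugins can add this for you):
<meta name="robots" content="noindex, follow" />
The "follow" part means links on the page still get crawled; only the page itself stays out of the index.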
Hope it helps. Good luck.
Hey,
It's a little old, but Rand did a video about this that might help.
http://www.seomoz.org/blog/whiteboard-friday-domain-trust-authority
Ultimately it comes down to how important your site is seen as being. If no one is talking about you, then you're perceived as not that relevant. Links are conversations, so they're probably the best metric to work on to increase your score.