This is all about an aspect of programming in C. If you neither understand nor care about it, you can skip this piece entirely without loss.
I have reluctantly come to the conclusion that NULL should never be used. It does not do what its common uses want it to do, though it appears designed to, and, worse, it works often enough that much incorrect code gets written using it.
The apparent intended use of NULL is as a generic nil pointer value, and this is its commonest use. (In an attempt to reduce confusion, I use "nil pointer" in this document for what the C standard calls a "null pointer".)
As a generic nil pointer, NULL is correct in exactly those places where 0 is equally correct (code that uses NULL cannot be correct on implementations that define NULL as 0 unless 0 is correct, and 0 is specifically permitted as a definition for NULL). However, NULL is often used in places where insufficient context is available to provide a pointer type, such as a variable argument in the actual argument list to a varargs function. In these places, 0 is not correct (because it's not the correct pointer type), so NULL cannot be either. (The only exception is code that is not intended to be portable to implementations that choose the "integer constant expression with value zero" option for their definition of NULL. Such cases are very rare, especially since a new release of a given OS or compiler may change the definition of NULL—and, even then, the remarks below about the "such an expression cast to type pointer-to-void" option apply.) Of course, NULL is correct when explicitly cast to the appropriate pointer type—but so is 0.
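
To make the varargs case concrete, here is a minimal sketch using the POSIX execl() interface, whose trailing arguments are variadic and must end with a nil char * (the particular shell command is just an arbitrary example):

    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        /* Risky: if NULL is a plain integer 0, an int is passed where
         * execl() will read a char *; sizes, representations, and
         * argument-passing conventions need not match. */
        /* execl("/bin/sh", "sh", "-c", "echo hello", NULL); */

        /* Correct: the cast supplies the pointer type that the variadic
         * part of execl()'s prototype cannot supply by itself. */
        execl("/bin/sh", "sh", "-c", "echo hello", (char *)0);

        perror("execl");    /* reached only if execl() fails */
        return 1;
    }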
However, the use of NULL in places without pointer context works often enough to lead code authors to think that it's OK. Specifically, it works when (a) NULL is defined using the integer-expression option, integers are the same size as the relevant pointer type, integer 0 has the same representation as a nil pointer of the relevant type, and (if it's in an argument list) integers and the relevant pointer type use sufficiently similar parameter passing mechanisms (eg, the same place on the stack, or the same register); or (b) NULL is defined using the cast-to-void-* option, and a void * nil pointer has the same size and representation (and parameter passing mechanism, if applicable) as a nil pointer of the relevant type.
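
To see why it only works "often enough", here is a sketch of a home-grown varargs function with a nil-pointer sentinel; print_all is an invented example, not a library routine:

    #include <stdarg.h>
    #include <stdio.h>

    /* Prints its char * arguments until it reads a nil char *. */
    static void print_all(char *first, ...)
    {
        va_list ap;
        char *s;

        va_start(ap, first);
        for (s = first; s != 0; s = va_arg(ap, char *))
            fputs(s, stdout);
        va_end(ap);
    }

    int main(void)
    {
        /* Happens to work only under condition (a) or (b) above:
         * va_arg reads a char *, but NULL may have passed an int. */
        /* print_all("hello, ", "world\n", NULL); */

        /* Always correct: what is passed is exactly what va_arg reads. */
        print_all("hello, ", "world\n", (char *)0);
        return 0;
    }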
Most current machines fit the second description; most where integers and pointers are the same size also fit the first—and the ones where they are different usually use the void * option to avoid "breaking code" (more accurately, to avoid pointing up certain classes of pre-existing brokenness in code). Thus, coders rarely get burnt by misusing NULL as a generic nil pointer—until it's time to port to a machine that doesn't fit the above, and much hair-pulling ensues (often involving finger-pointing by code authors who do not understand why their code is breaking and who thus incorrectly blame the new system).
To confuse the issue even further, "null character" is a common term for the zero value used as a terminator in C strings, a terminology reinforced by the "NUL" name for ASCII codepoint 0. This leads people to use NULL when they want the "string terminator" semantic, which is never correct, since NULL might be of a non-integer type, and there is no guarantee that converting a nil pointer back to an integer will give zero. (Again, code not intended to be portable to implementations that define NULL with the void * option might be an exception, but, again, 0 is at least as correct as NULL is.)
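
A small sketch of the distinction (copy_and_terminate, dst, and src are invented names for illustration):

    #include <stddef.h>
    #include <stdio.h>
    #include <string.h>

    /* Copy len characters of src into dst (assumed large enough)
     * and terminate the result as a C string. */
    static void copy_and_terminate(char *dst, const char *src, size_t len)
    {
        memcpy(dst, src, len);
        dst[len] = '\0';     /* correct: the zero character */
        /* dst[len] = 0;        equally correct, if less mnemonic */
        /* dst[len] = NULL;     wrong: NULL may be ((void *)0), which does
         *                      not convert implicitly to char, and a nil
         *                      pointer need not convert to zero anyway */
    }

    int main(void)
    {
        char buf[16];

        copy_and_terminate(buf, "hello", 5);
        puts(buf);
        return 0;
    }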
Thus, everywhere NULL is correct, 0 is at least as correct, and does not perpetuate the confusion that leads to misuse of NULL. The only value in NULL is that (when used correctly) it signals to human readers that the code's author thought of the value as a nil pointer. However, since it does not carry any information as to what type of nil pointer it's supposed to be, this is of limited value; worse, since NULL is so often misused in other ways, even this implication is discouragingly weak. In my opinion, the confusion resulting from using NULL far outweighs the implicit-comment value of using it—it is better to write 0, and, when commentary for human readers is called for, use a comment.
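
For illustration, the style recommended here looks something like this (struct node and head are invented names):

    #include <stdio.h>

    struct node { int value; struct node *next; };

    int main(void)
    {
        struct node *head = 0;    /* nil pointer: the list starts empty */

        if (head == 0)            /* nil-pointer test; no NULL needed */
            printf("the list is empty\n");
        return 0;
    }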