> One big thing is that NULL should never "mean" anything in a business sense. It is the absence of a value, hence it cannot mean anything
I always had a problem with that notion. I mean, it has a memory representation, it has a set of operators you can apply to it, defined behavior in UNIQUE and FOREIGN KEY constraints, etc. All of this is well documented (though it can behave slightly differently between databases, as you mentioned).
So it has a set of valid values (just one: NULL) and a set of valid operations, which makes it a type!
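To make that concrete, here's a rough sketch of the kind of defined behavior I mean (table and column names are made up, syntax is PostgreSQL-ish, and the UNIQUE detail is exactly one of those spots where databases disagree):

```sql
-- Comparison operators are defined for NULL; they just yield UNKNOWN,
-- which is why the dedicated IS NULL predicate exists:
SELECT NULL = NULL;    -- NULL (i.e. UNKNOWN), not TRUE
SELECT NULL IS NULL;   -- TRUE

-- UNIQUE constraints have defined (if divergent) behavior too:
-- PostgreSQL/MySQL treat NULLs as distinct from each other, SQL Server does not.
CREATE TABLE t (val INT UNIQUE);
INSERT INTO t VALUES (NULL);
INSERT INTO t VALUES (NULL);   -- fine in PostgreSQL/MySQL, rejected by SQL Server
```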
And now you have a type that looks somewhat similar to null in "normal" programming languages, and since SQL generally lacks mechanisms for inventing your own types, why wouldn't you use it in your business logic where it makes sense?
The design of NULL seems like a historical accident anyway. As best I can discern, there was a need for something that behaves as "excluded" in the context of FOREIGN KEYs and outer joins, and that semantic was then passed along to other areas where it made less sense.
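Roughly what I mean by "excluded", with a made-up parent/child schema (again PostgreSQL-ish syntax):

```sql
CREATE TABLE customers (id INT PRIMARY KEY);
CREATE TABLE orders (
    id          INT PRIMARY KEY,
    customer_id INT REFERENCES customers(id)   -- nullable: NULL means "no parent"
);
INSERT INTO customers VALUES (1), (2);
INSERT INTO orders    VALUES (10, 1), (11, NULL);  -- the NULL row is simply excluded from the FK check

-- Outer joins manufacture NULLs for the non-matching side:
SELECT c.id AS customer, o.id AS order_id
FROM customers c
LEFT JOIN orders o ON o.customer_id = c.id;
-- customer 2 comes back with order_id = NULL
```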
I think a better type system and a cleaner separation between comparison logic and the core type would have obviated most of NULL's weirdness and made it far less foot-gunny in the process...
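The foot-guns I have in mind are mostly the three-valued-logic traps, e.g. (made-up table, exact dialect details vary):

```sql
CREATE TABLE people (name TEXT, city TEXT);
INSERT INTO people VALUES ('a', 'Oslo'), ('b', 'Bergen'), ('c', NULL);

-- Rows where the predicate evaluates to UNKNOWN are silently dropped,
-- so 'c' shows up in neither query:
SELECT name FROM people WHERE city =  'Oslo';   -- a
SELECT name FROM people WHERE city <> 'Oslo';   -- b, but not c

-- NOT IN against a list containing a NULL can never be TRUE,
-- so this returns no rows at all:
SELECT name FROM people WHERE city NOT IN ('Oslo', NULL);
```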