vongrunt,
<<...flaws in DB/application design (rarely used fields, low normalization,...>>
In the real world, however, the type of normalization that would be needed to eliminate nulls is simply not feasible. A NULL is ideal for a field that applies to some records in a table but not to others. Now, technically, if you were a Normalization fanatic, you'd have that field in a separate table and join it on the pk/fk.
So take the example (granted, not a real-world example, but for academic purposes...) of a table 'TruckDrivers' with the field 'TruckModelYear'. Now, some drivers may not have been issued a truck yet. What do you put there? And what happens when you run a query that needs 'average age of trucks assigned to drivers'?
Storing 0's instead of NULLs will pull the average down to the point where the query is useless. If those fields are NULL, the rows are simply ignored by the average.
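To make that concrete, here's a rough sketch (the table, data, and names are made up for illustration, and the syntax is generic SQL):

CREATE TABLE TruckDrivers (
    DriverID       INT PRIMARY KEY,
    DriverName     VARCHAR(100) NOT NULL,
    DriverCategory VARCHAR(50)  NOT NULL,
    TruckModelYear INT NULL           -- NULL = no truck issued yet
);

-- AVG() skips NULLs, so drivers without trucks simply don't factor in:
SELECT AVG(TruckModelYear) AS AvgModelYear
FROM TruckDrivers;

-- Had you stored 0 instead of NULL, those 0's would be averaged in and
-- drag the result down to something meaningless.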
And granted--in my example you may well normalize, but there are many times when it's just not feasible, and most of us live and work and develop in the real world. The hard-line E.F. Codd followers with the elbow patches who have never seen the outside of a classroom may not stop arguing the point until their tenure ends (meaning never), but that's the way it goes.
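For what it's worth, the fully-normalized design the purists would push looks something like this sketch (names again hypothetical; TruckDrivers would drop its TruckModelYear column):

-- The "no NULLs allowed" design: the optional attribute moves into its
-- own table, and only drivers who actually have a truck get a row here.
CREATE TABLE TruckAssignments (
    DriverID       INT PRIMARY KEY REFERENCES TruckDrivers(DriverID),
    TruckModelYear INT NOT NULL
);

-- Every report that touches truck age now needs an extra join:
SELECT AVG(ta.TruckModelYear) AS AvgModelYear
FROM TruckDrivers td
JOIN TruckAssignments ta ON ta.DriverID = td.DriverID;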
Now, you could add the criterion WHERE TruckModelYear > 0... BUT what if the query needs other info besides that? e.g., 'Give me a query that lists the number of all Drivers, grouped by DriverCategory, with the average truck age for each group.'
It starts to get hairy with the SQL needed to exclude the 0's from the average but still include those records in the count for the group--and that's just a simple example.
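Compare the two versions of that grouped report. With NULLs it's trivial, because AVG() skips them while COUNT(*) still counts every driver; with 0's standing in for 'no truck' you have to push the 0's back out of the average yourself (NULLIF is one way, where the engine supports it):

-- With NULLs: dead simple.
SELECT DriverCategory,
       COUNT(*)            AS DriverCount,
       AVG(TruckModelYear) AS AvgTruckModelYear   -- NULLs ignored automatically
FROM TruckDrivers
GROUP BY DriverCategory;

-- With 0's in place of NULLs: manually convert the 0's back to NULL so they
-- stay out of the average but the rows still show up in the count.
SELECT DriverCategory,
       COUNT(*)                          AS DriverCount,
       AVG(NULLIF(TruckModelYear, 0))    AS AvgTruckModelYear
FROM TruckDrivers
GROUP BY DriverCategory;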
Having NULLs in these fields makes it a no-brainer. And it's my contention that once one understands what NULLs mean to the application and what NULLs mean to the database engine, then they are no problem at all.
--jsteph