This is not strictly a C# question. I am thinking of building a genealogy database, and it occurred to me that the date ranges I could be dealing with may fall outside the 1753+ range handled by MS SQL's datetime type.
Rather than get myself stuck in a corner later, I want to deal with it now. I can think of a couple of ways of handling this, but the best I have come up with is to create my own date object and store it in the database as a 4-byte integer holding minutes from year zero. That should give me a range of roughly +/-4000 years, and nobody should complain about only having minute-level accuracy in a genealogy database.
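As a sanity check on that range claim, this is roughly what I have in mind (the GenealogyDate name and the choice of epoch are just placeholders, not a settled design):

```csharp
// Rough sketch: a signed 32-bit count of minutes relative to some epoch.
// int.MaxValue minutes / (60 * 24 * 365.2425) works out to a bit over
// 4,000 years in each direction, which should cover my use case.
public readonly struct GenealogyDate
{
    private const double MinutesPerYear = 60 * 24 * 365.2425;

    // Minutes relative to the chosen epoch; negative values fall before it.
    public int MinutesFromEpoch { get; }

    public GenealogyDate(int minutesFromEpoch) => MinutesFromEpoch = minutesFromEpoch;

    // Approximate reach of the encoding on either side of the epoch (~4083 years).
    public static double MaxYearsEachWay => int.MaxValue / MinutesPerYear;
}
```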
Here is the clincher: does anyone know a good source for the date math? Counting the minutes within a day (ignoring leap seconds) is simple enough, but converting between a calendar date and a running day count, with months ranging from 28 to 31 days, is where I am not entirely sure of the math.
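The closest thing I have found so far is the standard proleptic-Gregorian conversion between a civil date and a day count (a hand port of the well-known days-from-civil / civil-from-days algorithm). It uses a 1970-01-01 epoch rather than my "year zero", but that is just a constant offset. This is untested, so treat it as a sketch rather than something I trust yet:

```csharp
static class CivilDate
{
    // Days from 1970-01-01 in the proleptic Gregorian calendar.
    // Negative results are dates before 1970.
    public static long DaysFromCivil(int y, int m, int d)
    {
        y -= m <= 2 ? 1 : 0;                                        // treat Jan/Feb as months 13/14 of the prior year
        long era = (y >= 0 ? y : y - 399) / 400;                    // 400-year Gregorian cycle
        long yoe = y - era * 400;                                   // year of era   [0, 399]
        long doy = (153 * (m + (m > 2 ? -3 : 9)) + 2) / 5 + d - 1;  // day of year   [0, 365]
        long doe = yoe * 365 + yoe / 4 - yoe / 100 + doy;           // day of era    [0, 146096]
        return era * 146097 + doe - 719468;                         // 719468 days from 0000-03-01 to 1970-01-01
    }

    // Inverse of DaysFromCivil: day count back to (year, month, day).
    public static (int Year, int Month, int Day) CivilFromDays(long z)
    {
        z += 719468;
        long era = (z >= 0 ? z : z - 146096) / 146097;
        long doe = z - era * 146097;                                       // [0, 146096]
        long yoe = (doe - doe / 1460 + doe / 36524 - doe / 146096) / 365;  // [0, 399]
        long y = yoe + era * 400;
        long doy = doe - (365 * yoe + yoe / 4 - yoe / 100);                // [0, 365]
        long mp = (5 * doy + 2) / 153;                                     // [0, 11]
        long d = doy - (153 * mp + 2) / 5 + 1;                             // [1, 31]
        long m = mp < 10 ? mp + 3 : mp - 9;                                // [1, 12]
        return ((int)(y + (m <= 2 ? 1 : 0)), (int)m, (int)d);
    }
}
```

From there, minutes-from-epoch would just be `DaysFromCivil(...) * 1440` plus the minutes within the day, but I would love a reference that confirms (or corrects) this approach.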
Can anyone suggest anything?