I was working on a grid display and I had to properly sort date columns. The values provided were not datetimes, but strings like "20 Jan 2017" or "01 Feb 2020". Obviously, sorting them alphabetically would not be very useful, so I implemented a custom sorting function that first parsed the strings as dates, then compared them. Easy enough, particularly since JavaScript's Date object has a static parse method that understands this format.
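
  As an illustration, here is a minimal sketch of such a comparator (the function name is mine, not from the actual grid code):

// hypothetical comparator for a grid date column:
// parse both strings, then compare the resulting timestamps
function compareDateStrings(a, b) {
  return Date.parse(a) - Date.parse(b);
}

['01 Feb 2020', '20 Jan 2017'].sort(compareDateStrings);
// ['20 Jan 2017', '01 Feb 2020']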

  The problem came with the string "01 Jan 0001", which appeared seemingly at random among the otherwise ordered values. I first thought an error was being thrown somewhere, or that the string simply failed to parse, or even that it was a numerical overflow. It was none of those. Instead, it was about how the year part is handled.

  A little context first:

Date.parse('01 Jan 0001') //978300000000
new Date(0) //Thu Jan 01 1970 00:00:00

Date.parse('01 Jan 1950') //-631159200000
new Date(Date.parse('01 Jan 1950')) //Sun Jan 01 1950 00:00:00

Date.parse('31 Dec 49 23:59:59.999') //2524600799999
Date.parse('1 Jan 50 00:00:00.000') //-631159200000

new Date(Date.parse('01 Jan 0001')) //Mon Jan 01 2001 00:00:00

  The first two lines almost had me convinced that JavaScript does not handle dates before 1970: parsing the year 1 produced a positive timestamp, as if earlier values were clamped to the epoch or later. The next two lines disproved that, since 1950 yields a perfectly valid negative value, and made me think it was a case of numerical overflow instead. The next two demonstrated it was not so. Now look closely at the last line. What? 2001?
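
  To rule out overflow completely: a JavaScript Date can represent any instant within about ±8.64e15 milliseconds of the Unix epoch, and the year 1 sits comfortably inside that range. Assuming the proleptic Gregorian calendar that Date uses, the instant is even constructible directly from a timestamp:

new Date(-62135596800000).getUTCFullYear() //1
// the value itself is representable; only the string parsing is at fault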

  The problem was the handling of years numerically smaller than 100. The parser assumes a two-digit year was intended and treats the input as Date.parse('01 Jan 01'), which comes out as 2001. We get a glimpse into how it works, too: everything between 50 and 99 is translated into 19xx, and everything between 00 and 49 is considered 20xx.
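
  If years below 100 must round-trip correctly, one workaround is to bypass the string parser for the year part and set it explicitly with setFullYear, which takes the value literally instead of windowing it. A sketch, assuming the fixed "dd MMM yyyy" format from the grid (the helper name is mine):

// hypothetical safe parser: parse day and month against a placeholder
// year (2000, a leap year, so even '29 Feb' survives), then force the
// real year, which setFullYear does not run through the two-digit window
function parseDayMonthYear(value) {
  var parts = value.split(' '); // e.g. ['01', 'Jan', '0001']
  var date = new Date(Date.parse(parts[0] + ' ' + parts[1] + ' 2000'));
  date.setFullYear(parseInt(parts[2], 10));
  return date;
}

parseDayMonthYear('01 Jan 0001').getFullYear() //1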

  Note that .NET does not have this problem, correctly distinguishing between a two-digit and a four-digit year.

  Hope this helps!
