One of the places where computer programming and Western history can't help but intersect is the question of the Julian vs. Gregorian calendars. For me, that hit the radar in the Java programming language. But the PHP language two-ups (as opposed to one-ups) the J/G distinction by also including the Jewish calendar and--I am so not making this up--the French Revolutionary calendar.
It turns out that, for the programmer (including those who write languages for other programmers), date and time are thornier issues than they appear on the surface. At the most basic computing level, a date and time is a number. On modern UNIX-based systems, if that number is zero, it's precisely midnight (in Greenwich, UK) on January 1st, 1970. Negative values fall before that date, positive ones after it. Each number is a count of seconds--or, in Java's case, milliseconds (one thousand milliseconds = one second)--from the start of The Me Decade.
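To make that concrete, here's a minimal sketch in Java using the standard java.time.Instant class (a newer API than the one that first put this on my radar, but the epoch arithmetic is the same):

    import java.time.Instant;

    public class EpochDemo {
        public static void main(String[] args) {
            // Zero is midnight UTC, January 1st, 1970 -- the UNIX epoch
            System.out.println(Instant.ofEpochSecond(0));        // 1970-01-01T00:00:00Z

            // Negative values fall before the epoch; one day is 86,400 seconds
            System.out.println(Instant.ofEpochSecond(-86_400));  // 1969-12-31T00:00:00Z

            // Java's own clock counts milliseconds since that same instant
            System.out.println(Instant.ofEpochMilli(System.currentTimeMillis()));
        }
    }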
Arbitrary--and perhaps arcane--as it is, this is what the writers of computer languages have to build on. In other words, that "primitive" number must be translated (transliterated?) into more recognizable notions of year, month, day, hour, minute, second and millisecond. And, as noted above, the question of whose definition of year/month/day/etc. applies adds another layer of complexity. And that's before the convenience APIs many languages provide, like the ability to extract the day of the week as well as the day of the month from a given date. Or a menu of options for formatting years, months and days as naked numbers vs. padded ones--e.g. the difference between 1/1/70 and 01/01/1970. (As if merely keeping track of leap years isn't complex enough...)
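In Java, for instance, both of those conveniences fit in a few lines; the class and formatting-pattern choices below are just one way to do it:

    import java.time.LocalDate;
    import java.time.format.DateTimeFormatter;

    public class FormatDemo {
        public static void main(String[] args) {
            LocalDate epochDay = LocalDate.of(1970, 1, 1);

            // Day of week and day of month, extracted from the same date
            System.out.println(epochDay.getDayOfWeek());   // THURSDAY
            System.out.println(epochDay.getDayOfMonth());  // 1

            // Naked vs. padded formatting of the same date
            System.out.println(epochDay.format(DateTimeFormatter.ofPattern("M/d/yy")));     // 1/1/70
            System.out.println(epochDay.format(DateTimeFormatter.ofPattern("MM/dd/yyyy"))); // 01/01/1970
        }
    }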
Of course, for all the handy things modern APIs can do with basic milliseconds, two things they can't do are tell you how many anyone has to work with, or how they should be put to best use. Which, upon reflection, is a good thing. After all, computers--no matter how powerful--should never be allowed to aspire to fortune-telling or philosophy. That would be a disaster second only to cats spontaneously evolving thumbs.