Y1.8K.

I’m not sure it will be so easy. At my last employer, we were running on 64-bit systems and theoretically should have been safe, but our own code wasn’t written very well and frequently used “int” where it should have used time_t. The tricky thing is that the code appears to work, and will continue to work for another 20 years or so before it suddenly explodes.

I wonder how much code like that is out there. Even professional programmers can be surprisingly lazy and/or ignorant about such things.

Yeah…
But dumping the code into a modern compiler with strict type-checking turned on should find those kinds of errors.

Assuming the source code still exists.

If you Google “January 1 19100” you find plenty of reports of such problems. I remember seeing quite a bit of this on the web at the time, but as you say it was usually non-critical.

That’s not exactly correct. Herman Hollerith invented his tabulating machines in the 1880s. If years were encoded on the cards as 2 digits then they would have had the same problems.

I knew someone would want to contradict me :slight_smile: but you may have glossed over my subtle point…I referenced '94 without mentioning the century, heh, heh. How do you know I didn’t mean 1794?

If computers had been around in 1900, the problem probably would have been February 29 appearing when it isn’t a leap year.

Actually, there was a problem with two-digit dates even back as far as 1900.

In cemeteries.

People would get headstones in advance, and since it was cheaper to engrave it at one time, people would put 1845-18__ as the inscription. Some lived past 1900, so things had to be redone.

The same thing occurred in 2000, though I think the funeral directors were more aware of the possibility.

Yeah, no. Not a chance. Have you ever actually turned on those checks on a serious program? Implicit integer conversions are endemic in C/C++. Eliminating them all in a real-world program simply isn’t plausible. If you were to try to use the checks to find time_t conversion errors you’d drown in the noise.

What will probably happen is that compilers or static type checkers will have a special case for warning specifically about converting time_t to other integer types.

Well, I don’t know how serious, but yes…

I recently ported some embedded code (tens of thousands of lines) from an 8-bit system to a POSIX 64-bit system.
And there were hundreds of type errors, and I fixed them all.

And it took a long time.

Yes, people were told this back in the 70’s. The trouble is they often had to write software so that it would work with legacy data. Records from last year were still needed, and they used 2 digit dates. So the new system had to understand 2 digit dates. Correcting the problem would involve a lot of extra work, and wouldn’t matter for another 25 years, so why bother? Three updates later, they still have to use 2 digit dates.

Fair enough. I never actually *worked* as a programmer, so I don’t know what the professional practice was. I’ll stand corrected.

That will only find conversions and comparisons involving time_t. Code that only uses ints wouldn’t be caught, and no doubt there’s some out there.

Around 1980, I reviewed some date-handling code that a colleague had written. His leap year check correctly said that century years are not leap years but didn’t account for the 400-year exception, so it would have said that 2000 was not a leap year. I corrected this, but we both had a chuckle about the idea of our software still being used 20 years from now. Software changes so fast that (some) programmers (certainly not me!), when faced with the choice of doing something simple which may break in 20 or 30 years or doing it correctly but in a more complex way, may think “This code is not still going to be running 20 years from now, so it’s not worth worrying about problems that will only occur in the far future; furthermore, if the code is still in use, someone will be maintaining it and they can deal with the problem.”

ETA: As it happens, that company went out of business well before 2000.

https://www-users.cs.york.ac.uk/susan/joke/decly.htm

At my company in the 80’s we were thinking about date formats.

I bet a lot of software has been and is still written using the simple “divisible by 4 = leap year” rule. This works between 1901 and 2099 but will fail in 2100. Beware the Y2100 catastrophe!

In the 1980s I worked for a bank which saved the year as an 8-bit offset from 1900, which could handle dates up to 2026. They also resolved the issue by going out of business, due to the savings and loan crisis.

This reminds me of the Doper with a vehicle clock that’s been doing wonky things for a while. (Maybe from a leap day goof.)

Can’t find it apparently without knowing more, like a username.

Anyone? Bueller? Bueller?

https%3A%2F%2Fboards%2Estraightdope%2Ecom%2Fsdmb%2Fshowthread%2Ephp%3Ft%3D643968

A 64-bit counter only pushes the problem back, though. Somewhere around the year 292,000,000,000, we will just run into the same problem all over again.

We need to start thinking long term.

Yep, thanks. That’s it.

Fixed link: https://boards.straightdope.com/sdmb/showthread.php?t=643968

A Feb. 28+ like error. But looks like it got reset and may not be tracked anymore.

Anyway, an example of how various date errors still crop up. Didn’t Apple have a problem a couple years ago bricking some devices due to a date overflow in an OS upgrade?