Another major area that I recall was medical equipment. A lot of monitors had dates and times built into their systems, and there was a concern that they might stop working properly. You wouldn't want heart monitors or automatic drug dispensers going wonky while hooked up to a patient.
(This probably led to the Y2K-compliant marking for the x-ray bulb mentioned by @beowulff; once you started wanting to be sure of Y2K compliance, everything had to be marked with it during the transition, even if the particular item didn’t have any code involved.)
I guess since I was in the middle of it at two different companies, I didn't notice the overstated part, as I had a pretty damn good understanding of the actual issues.
It was a real problem that would have caused a lot of major supply-chain issues, and as we recently found out during the pandemic, major supply-chain issues cause a lot of grief.
But what was the doom and gloom? Was it that silliness about planes falling out of the sky? I didn’t think people took that seriously.
Industry did the work and did it well. We not only fixed our problems but we answered surveys from over 70% of our buyers to verify we were addressing the issue and solving it. This was common. Very common in manufacturing.
Our main ERP would have failed badly, but we were still under maintenance, so I got us up to date with the new version that used ISO date formats and then fixed all our custom code to match. Massive testing for a very small shop, with people from every department involved.
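For anyone curious, the core of that kind of conversion was usually a "windowing" routine along these lines. This is just a toy Python sketch of the general technique, not our actual ERP code, and the pivot year of 70 is an arbitrary choice for illustration:

```python
def yymmdd_to_iso(yymmdd: str, pivot: int = 70) -> str:
    """Expand a legacy two-digit-year date like '991231' to ISO '1999-12-31'.

    The pivot-year "window" decides the century: two-digit years at or above
    the pivot are treated as 19xx, anything below as 20xx. The pivot of 70 is
    an arbitrary assumption here; real projects picked one that fit their data.
    """
    yy, mm, dd = int(yymmdd[0:2]), int(yymmdd[2:4]), int(yymmdd[4:6])
    century = 1900 if yy >= pivot else 2000
    return f"{century + yy:04d}-{mm:02d}-{dd:02d}"


print(yymmdd_to_iso("991231"))  # 1999-12-31
print(yymmdd_to_iso("000101"))  # 2000-01-01
```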
We also checked on PCs to see which ones were going to fail. It was only a few, but one would have shut down QC if not discovered. That was in many ways the biggest pain, as I had little control over that system and needed to get a small outside shop to fix their shit code and migrate everything over to a new (for the time) PC.
Very much so, and more. Often it wasn't that they were afraid; they simply didn't have the source code. Sometimes software had been purchased from companies that had since gone out of business, and nobody had the source code. And many companies had to make sure they weren't going to be liable for Y2K problems in all the software they'd had no hand in creating and had no control over.
Even when they did have the source code, it's not as simple as looking for calls to a date/timestamp function. Tracking how that date or time was used after it was retrieved, stored, and then used again later was incredibly difficult. It took an enormous number of man-hours in some cases.
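To give a sense of why it was hard to track, here's a toy Python illustration (purely hypothetical, not code from any real system): the retrieval looks harmless, and the damage only shows up two steps later, somewhere else entirely.

```python
from datetime import date

# The call that *retrieves* the date looks perfectly fine...
today = date(2000, 1, 1)

# ...but somewhere else the value gets *stored* in a legacy six-character
# field, silently throwing away the century...
stored = today.strftime("%y%m%d")          # '000101'

# ...and much later a third routine expands it back, assuming 19xx:
yy, mm, dd = int(stored[:2]), int(stored[2:4]), int(stored[4:6])
reconstructed = date(1900 + yy, mm, dd)

print(today, "->", reconstructed)          # 2000-01-01 -> 1900-01-01
```

Finding every place where that middle step happened, across millions of lines of code, was the real job.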
All that work resulted in companies getting out the checkbook to pay for the upgraded hardware and software they had long needed. Unfortunately, some of us were making good money supporting those obsolete systems; for some of those companies it was Y2K every day, with every new problem that arose. C'est la vie.
There is a leap year every four years. Unless the year is divisible by 100. So there was no leap year in 1700, 1800, or 1900. BUT, if the year is divisible by 400, it is a leap year. So 2000 was a leap year.
One morning in early 2000, I got work done on my car. When I wrote the check, I looked at my watch, and entered the date it displayed: March 1st.
When I got in to work and logged on, I learned that the date was actually February 29. Because it was a leap year. I called the mechanic and told them I’d unintentionally post-dated the check by one day. They said not to worry about it.
Whoever wrote the code for my ten-dollar watch didn’t know their calendar rules.
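For what it's worth, the full Gregorian rule described above fits in a couple of lines of Python. My guess (and it's only a guess) is that the watch firmware implemented something like the second version, with the century exception but without the 400-year exception:

```python
def is_leap_year(year: int) -> bool:
    """Full Gregorian rule: every 4th year, except century years,
    unless the century year is divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)


def is_leap_year_watch(year: int) -> bool:
    """A guess at what a cheap watch might have done: the century
    exception without the 400-year exception."""
    return year % 4 == 0 and year % 100 != 0


print(is_leap_year(2000), is_leap_year_watch(2000))  # True False
```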
Yeah, for the systems I was working on, analysing the code would have taken forever. We had to spin up a complete replica of the system on separate infrastructure, set the date and time to New Year's Eve, then run it for a couple of months, complete with test inputs and processes, to see what would happen, which would then direct the investigation into the code. This wasn't the best approach, because it still didn't cover what would happen at the next year-end, and such testing often idealises the scenarios, omitting real-world collisions of circumstance, weird inputs from users, etc.
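Something like this toy Python harness captures the spirit of it, though the real thing was an entire replica environment rather than a script, and the routine under test here is completely made up:

```python
from datetime import date, timedelta


def age_in_days_legacy(birth_yymmdd: str, today: date) -> int:
    """Made-up stand-in for a routine under test that still handles
    the current year as two digits internally."""
    yy, mm, dd = int(birth_yymmdd[:2]), int(birth_yymmdd[2:4]), int(birth_yymmdd[4:6])
    todays_yy = today.year % 100
    return (date(1900 + todays_yy, today.month, today.day)
            - date(1900 + yy, mm, dd)).days


# Step a simulated clock across the century boundary, the way the replica
# environment's clock was rolled forward, and flag any output that jumps.
clock = date(1999, 12, 30)
previous = None
for _ in range(5):
    result = age_in_days_legacy("621105", clock)   # someone born 5 Nov 1962
    if previous is not None and abs(result - previous) > 2:
        print(f"ANOMALY at {clock}: {previous} -> {result}")
    previous = result
    clock += timedelta(days=1)
```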
I don’t seriously claim Y2K, or any other date cliff, should be ignored until there are failures.
But I will mention that when I came into my federal office, for post-installation review, on the morning of 1/1/2000, I tried the credit union ATM in the lobby, and it was down.
By the time I left that afternoon, it worked.
Planned down time, or quick and dirty last minute fix? I hope credit unions aren’t a realm of the quick and dirty, but can only guess.
A lot of places intentionally went down that morning until they were sure that things were ok. The place I worked told us not to use our PCs until we got the all-clear, and that it might be days. We were back to regular work by 10am.
In all seriousness, we responded very well to an upcoming crisis and did the work and the testing, and because everything was done well, people think it was a nothing-burger.
Same fucking dumb as shit thinking that makes slowing or stopping climate change so hard.
There definitely was overhype involved. Look at all of the people stocking up on bottled water beforehand, for instance: Even if absolutely nothing had been done to fix the problems, municipal water supplies would have continued to work just fine. It’s possible that the water company’s billing would have been screwed up by the bug, but if that had happened, someone at the water company, or at the local government body that oversees the water company, would have just ordered them to not turn off anyone’s service until the bug was fixed. It could have been a real headache sorting out who owed what afterwards, but meanwhile the water would have continued to flow.
Financial systems would have been badly impacted. Navigation systems could have been, too, so I wouldn’t have wanted to be flying anywhere at the time. But not everything would have been affected, like the doomsayers were predicting.
Water purification plants are heavily automated. If the scheduling of things like chlorination goes off, the water can go bad with bacterial contamination, like E. coli.
When I started my career in technology consulting in the mid 90s, the firm I joined sent all new hires to a 4-week COBOL boot camp. I'm so glad I never had to work on that shit!
A lot of the work the firm did was related to Y2K.
In the late 90s, a lot of companies were starting to get into client-server technology like Visual Basic or PowerBuilder and later web technology. So for many companies, I think a lot of them were like “we have to do all Y2K work on legacy systems, we might as well replace them with some new tech.”
HA! That would not be the case if it were reversed and the computer error showed you owed the bank $100 in interest. You would be sent to collections, your house foreclosed on, your car repoed, etc.
I was an army reservist, and I spent New Year's Eve 1999 on a base, just in case civilization collapsed, though I've no idea what exactly we were supposed to do about it. I remember watching Paris celebrate the New Year on a then-exotic big-screen TV, with a cool fireworks display off the Eiffel Tower, but overall it turned out to be a waste of time.
At the time, even the government was recommending people prepare for a 3-5 day interruption of services. Which is a good idea for everyone in any situation - having a few days of water & food available in case of an unforeseen emergency.
Now the preppers who had bunkers full of MREs - they were nuts.
Based upon the anecdotal commentary in this thread, every institution, every company, and every government took enough steps to avoid the predicted catastrophes, as I don't recall any catastrophic events occurring in January of 2000.
But there were some events that occurred.
US spy satellites stopped working for a few days
A New Yorker was told he owed over $90k in late fees for a video rental
Alarms in a Japanese nuclear plant started going off just after midnight
The US Naval Observatory reported the incorrect time in the early hours of 1/1/00
A baby born in Denmark was registered as being 100 years old
Again, no catastrophic events. I highly doubt every company and/or organization did what the posters above describe, and yet the resulting negative consequences were not catastrophic. Therefore, I highly doubt there would have been any catastrophic events even if nobody had done anything.
Can anyone who worked on these projects describe how someone would have been killed if they hadn't made the Y2K corrections at their respective companies?
Bank errors, customer nuisances, accounting problems, even power outages, are all correctable.
My neighbor at the time had about 50 cases of bottled water, 4 generators, about 10 gallons of fuel for the generators, and had emptied out his bank accounts and was keeping the cash in a lock box in his bedroom. During the first week of January 2000 he asked me if I wanted to buy one of the generators because Home Depot wasn't taking returns.
In a very simplistic explanation, if everyone had ignored Y2K, we would have had some bad supply chain snarls.
This would probably have led to some deaths, as some vital medical equipment would have run short in some areas.
Mostly though Y2K would have caused economic issues.
Please note how many odd-seeming shortages we had with COVID. Modern society relies on a huge number of logistical decisions that try to minimize stock in the warehouse. Little problems can add up quickly to big problems.