Ah, time travel to the beginning of the Epoch.
For us non-Tweeters, can you explain what we’re seeing, and what we should be seeing instead?
I think it’s the date?
Dude, funny sex number! Clearly not Elmo’s doing though, he’d have made the date April 20, 1969.
Computers often track time as the number of seconds that have passed since the stroke of midnight (UTC) on Jan 1, 1970. So if the time value is missing or invalid, it often ends up stored as 0 or -1 and shows up as December 31, 1969: zero seconds is midnight UTC on Jan 1, 1970, which is still the evening of the 31st in US time zones, and -1 is 11:59:59 PM UTC on the 31st.
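If you want to see it for yourself, here's a minimal C sketch (the local-time line depends on your TZ setting, and pre-1970 values aren't accepted on every platform):

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    char buf[64];

    /* Unix time 0: midnight UTC, January 1, 1970. */
    time_t t = 0;
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&t));
    printf("t = 0 in UTC:   %s\n", buf);   /* 1970-01-01 00:00:00 */

    /* The same instant in the local time zone; in the US this is
       still the evening of December 31, 1969. */
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S %Z", localtime(&t));
    printf("t = 0 locally:  %s\n", buf);   /* e.g. 1969-12-31 19:00:00 EST */

    /* -1 is a common "invalid time" sentinel, one second before the epoch
       (on platforms that accept pre-1970 values). */
    t = (time_t)-1;
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&t));
    printf("t = -1 in UTC:  %s\n", buf);   /* 1969-12-31 23:59:59 */

    return 0;
}
```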
This has happened to Twitter before:
I’ve seen that date timestamped on many computer files over the years, any time that info gets lost or corrupted. You’re right that it’s just an industry standard.
De facto industry standard, anyway. It’s often just easier to use the time system built into your operating system, which on Linux (and most Unix-likes) is Unix time, as Pleonast linked above.
It will potentially lead to some issues in 2038, when a signed 32-bit seconds counter finally runs out. That may be just over 14 years away, but we all know by now that we keep legacy systems running looooong after we thought they’d be replaced or updated.
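For the curious, here's a rough C sketch of why 2038 specifically; it assumes a modern machine where time_t is 64-bit and simulates the old 32-bit counter with a cast:

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void) {
    char buf[64];

    /* The largest value a signed 32-bit seconds counter can hold. */
    int32_t max32 = INT32_MAX;                   /* 2,147,483,647 */
    time_t t = (time_t)max32;
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&t));
    printf("last 32-bit second (UTC): %s\n", buf);   /* 2038-01-19 03:14:07 */

    /* One second later, a 32-bit counter wraps to a large negative
       number on typical two's-complement hardware, which reads as 1901. */
    int32_t wrapped = (int32_t)((int64_t)max32 + 1);
    t = (time_t)wrapped;
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&t));
    printf("one second later (UTC):   %s\n", buf);   /* 1901-12-13 20:45:52 */

    return 0;
}
```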
The shortsightedness of both date and time encoding standards is truly remarkable. One might feel the same way about the IPv4 address space, but to be fair, it was practically inconceivable in 1981, before there even was an internet, that more than 2^32 node addresses would ever be needed. But the progress of time is, like, kinda predictable, except apparently to neckbeard geeks.
In defense of neckbeard geeks, they presumably didn’t think Unix (and its time standard) would take over the world and hence be so hard to update safely. If they had implemented the standard a little closer to 2000, they presumably would have had a better feel for just how long business applications are kept running, and come up with a more clever plan.
Yeah…to the extent things weren’t predictable, the neckbeard geeks predicted nobody would keep legacy systems, designed for the immediate future, in use for decades and decades. They underestimated inertia and overestimated companies’ appetites for infrastructure investment.
But the same geeks (or at least, culturally related geeks) made the same mistake with dates and precipitated the Y2K crisis. The majority of the most important applications were for mainframes, and it was obvious at the time that these applications were very expensive to develop and would likely remain in use for a very long time. What were they thinking when they decided to use two digits for the year – that they’d be retired by the time year 00 rolled around, so somebody else’s problem?
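A toy illustration of the failure mode (not anybody's actual mainframe code, just the shape of the bug): with only two digits stored, arithmetic across the 1999-to-2000 rollover goes negative unless you "window" the values:

```c
#include <stdio.h>

int main(void) {
    /* Years stored as two digits, the way a lot of old records did it. */
    int hire_year_2digit    = 75;   /* meant as 1975 */
    int current_year_2digit = 0;    /* meant as 2000 */

    /* Two-digit arithmetic: 00 - 75 = -75 "years of service". */
    printf("years of service (two-digit math): %d\n",
           current_year_2digit - hire_year_2digit);

    /* The fixes were widening to four digits or "windowing":
       treat small values as 20xx and large ones as 19xx. */
    int full_current = (current_year_2digit < 50 ? 2000 : 1900) + current_year_2digit;
    int full_hire    = (hire_year_2digit    < 50 ? 2000 : 1900) + hire_year_2digit;
    printf("years of service (windowed):       %d\n", full_current - full_hire);

    return 0;
}
```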
It should have been clear from IBM’s strong focus on compatibility in the System/360 line and its successors that the mainframe software they were writing would likely be around for a long time.
CFOs generally don’t know shit about this stuff but companies should have had a CTO or CIO who did. But most of those guys are jargon-spouting nitwits.
Well, no, it wasn’t “obvious” they’d be used for that long. It was ‘obvious’ to them that it was shortsighted to do so but what is obvious to the programmers is not always obvious to CFOs and vice-versa.
We’ve run into versions of that at the last 3 companies I’ve worked at - using software packages that needed to be updated for years but weren’t for budgetary reasons (well, it’s not ‘critical’ just yet). And then suddenly having to pay the price when they inevitably weren’t up to snuff.
What was obvious was that computer hardware was much more expensive than programmers. Then that paradigm switched.
Save your bytes! Make your code as tight as possible!
From Anthropology to Computer Science. Where we going next?
I vote we get someone with a trans-dimensional teleporter to swap out our Elon Musk for Elon Tusk.
Feel free to dump our version in whichever hellhole dimension is handy as you’re completing the swap.
(Just bringing the thread home to where the hate is)
Elmo: buy a Tesla and you won’t be murdered by Hamas!
I would almost be willing to bet that there are some burned out teslas at that festival.
Whatever the condition is that makes him tweet such a thing is worsening. He’s despicable.
Cut Elmo some slack, he’s just being equal opportunity in trying to turn a profit off of human tragedy. Follow his favorite antisemite sources of disinformation to increase engagement on his social media platform but remember to buy his car if you don’t want to be murdered!
I expect Elmo to start handing out free Teslas to Hamas to make sure that such a thing never happens again.
I think in most (if not all) of these cases that isn’t what happened. You can’t make anything infinite. It doesn’t work that way. You can’t set up a system that can accommodate literally any number, or a data container with no size limit. That’s physically and conceptually impossible. There has to be a limit of some kind.
So either you set it to be as big as you possibly can given the limitations of technology at the time (such as setting it to the highest limit of whatever bit depth you have) or choose some limit that is so large that you can’t conceive of that limit being reached any time soon. In either case, you rely on the fact that technology will advance and that a bigger solution can and will be developed before it becomes a problem.
And, that’s what is happening. We have IPv6 now. We can make systems that can handle dates going centuries into the future. Yes, it’s a pain to change everything to accommodate those new systems, but that’s how progress works.
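Just to put numbers on the "pick a limit too big to matter" idea, here's a quick back-of-the-envelope in C comparing the old 32-bit seconds counter with the 64-bit one that has been replacing it:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* How many years of seconds fit in a signed 32-bit vs 64-bit counter,
       counting from 1970. */
    const double seconds_per_year = 365.2425 * 24 * 60 * 60;

    printf("32-bit counter: ~%.0f years\n", (double)INT32_MAX / seconds_per_year);
    printf("64-bit counter: ~%.2e years\n", (double)INT64_MAX / seconds_per_year);
    /* Roughly 68 years vs roughly 2.9e11 (about 292 billion) years. */

    return 0;
}
```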
Bill Gates is often quoted as saying that nobody would ever need more than 640KB of memory. That quote is apocryphal; if anything, he pushed for a larger limit, but 640KB was what the IBM PC’s design left over once part of the processor’s 1MB address space had been reserved for hardware. You always have to work within some kind of effective limit, and you have to trust that it will hold until it can be bypassed and a new system can be developed. That’s how the IT industry is.