Similar with Y2K — it was only a nothingburger because it was taken seriously, and funded well. But the narrative is sometimes, “yeah lol it was a dud.”
The question is, what will happen in 2038 when Y2K happens again due to an integer overflow? People are already sounding the alarm, but who knows if all of the systems will get fixed before it hits.
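For the curious, here’s a minimal C sketch of what that overflow looks like (just an illustration, not any system’s actual code): signed 32-bit Unix time counts seconds since 1970-01-01 and runs out at 03:14:07 UTC on 2038-01-19, after which the wrapped value decodes as a date in 1901.

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void) {
    /* Last second representable in signed 32-bit Unix time. */
    time_t last = (time_t)INT32_MAX;             /* 2147483647 */
    printf("last 32-bit second: %s", asctime(gmtime(&last)));
    /* -> Tue Jan 19 03:14:07 2038 */

    /* The value the counter wraps to one tick later. */
    time_t wrapped = (time_t)INT32_MIN;          /* -2147483648 */
    printf("after the overflow: %s", asctime(gmtime(&wrapped)));
    /* -> Fri Dec 13 20:45:52 1901, on platforms whose gmtime
       accepts pre-1970 times (e.g. glibc) */
    return 0;
}
```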
2038 is approaching super fast and nobody seems to care yet
At the rate of one year per year, even.
For each second that passes we’re one second closer to 2038
Except for leap seconds. Time is the worst to work with :(
AFAIK that’s not entirely true; Debian, for example, is changing the system time from a 32-bit integer to 64-bit, so I assume other distros are doing the same. However, this doesn’t help industrial or IoT devices running deprecated Unix/Linux derivatives.
It’s already been addressed in Linux - not sure about other OSes. They doubled the size of the time type, so now you can keep using it until roughly the year 292 billion. If you’re around then.
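A quick back-of-the-envelope check of that horizon, since the number sounds made up: 2^63 - 1 seconds really does work out to about 292 billion years.

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* Seconds representable by signed 64-bit Unix time. */
    const double max_secs = (double)INT64_MAX;        /* ~9.22e18 */
    const double secs_per_year = 365.25 * 24 * 3600;  /* ~3.16e7  */
    printf("64-bit time lasts ~%.0f billion years\n",
           max_secs / secs_per_year / 1e9);           /* ~292 */
    return 0;
}
```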
Finally it’d be the year of desktop Linux, once all the Windows users die off
Debian, for example, is currently recompiling everything from 32-bit to 64-bit timestamps (thanks to open source this is no problem). Dunno what happens to proprietary legacy software.
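On the mechanics of that recompile, assuming glibc 2.34 or newer on a 32-bit target: building with `_TIME_BITS=64` (which also requires `_FILE_OFFSET_BITS=64`) switches `time_t` to 64 bits, which is roughly what Debian’s time_t transition does package by package. A tiny way to check what a given build got:

```c
/* Build on 32-bit Linux (glibc >= 2.34) with and without the flags:
 *   gcc check_time.c                                          -> 4 bytes
 *   gcc -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 check_time.c   -> 8 bytes
 * On 64-bit targets, time_t is already 8 bytes either way. */
#include <stdio.h>
#include <time.h>

int main(void) {
    printf("sizeof(time_t) = %zu bytes\n", sizeof(time_t));
    return 0;
}
```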
All this hysteria over nuclear weapons is overblown. We’ve known how to build them for 75 years, yet there hasn’t been a single one detonated on inhabited American soil. They’re harmless
Yeah but not all people live on American soil…
It’s the American tradition to ignore that
You even dropped a few accidentally and nothing happened! Complete duds, these things, really.
I can’t remember the name but I think this is some kind of paradox.
Like, the preventative measures were so effective that they created a perception that there was no risk in the first place.
It’s called the prevention paradox: an issue is so severe that it gets prevented with proactive action, so no real consequences are felt and people think it wasn’t severe in the first place.
Y2K specifically makes no sense though. Any reasonable way of storing a year would use a binary integer of some length (especially when you want to use as little memory as possible). The same goes for manipulations; they are faster, more memory efficient, and easier to implement in binary. With an 8-bit signed integer counting from 1900, the concerning overflows would occur in 2028, not 2000. A base 10 representation would require at least 8 bits to store a two digit number anyway. There is no advantage to a base 10 representation, and there never has been. For Y2K to have been anything more significant than a text formatting issue, a whole lot of programmers would have had to go out of their way to be really, really bad at their jobs. Also, usage of dates beyond 2000 would have increased gradually for decades leading up to it, so the idea it would be any sort of sudden catastrophe is absurd.
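To make the arithmetic in that comment concrete (this only illustrates the hypothetical scheme it describes, not how legacy systems actually stored dates, as the replies below point out):

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* Hypothetical scheme from the comment above: the year stored
       as a signed 8-bit offset from 1900. */
    int8_t offset = INT8_MAX;                       /* 127 */
    printf("last representable year: %d\n", 1900 + offset);  /* 2027 */
    /* 2028 would need an offset of 128, which overflows int8_t. */
    return 0;
}
```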
Look up some info on BCD or EBCDIC.
Oh boy, you heavily underestimate the amount and level of bad decisions in legacy protocols. Read up on the topic. The date was for a long time stored as 6 decimal digits.
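For anyone who hasn’t met BCD: here’s a sketch of the kind of layout being described, a date packed as six decimal digits (YYMMDD), two per byte. The helper names are made up for illustration, but real RTC chips and COBOL records use the same idea. Watch what the two-digit year does at the end of 1999:

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical helpers: pack/unpack one byte holding two BCD digits. */
static uint8_t to_bcd(int v)    { return (uint8_t)(((v / 10) << 4) | (v % 10)); }
static int     from_bcd(uint8_t b) { return (b >> 4) * 10 + (b & 0x0F); }

int main(void) {
    uint8_t date[3] = { to_bcd(99), to_bcd(12), to_bcd(31) };  /* 99-12-31 */
    int yy = from_bcd(date[0]);

    /* Increment the two-digit year the way legacy code did... */
    yy = (yy + 1) % 100;                          /* 99 -> 0 */
    date[0] = to_bcd(yy);
    printf("next year reads as: %02d\n", from_bcd(date[0]));  /* 00 */
    /* Is that 1900 or 2000? That ambiguity was the whole Y2K bug. */
    return 0;
}
```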