Gotta disagree with you about Y2K. While not dealing with it might not have been catastrophic, it certainly would've been extremely damaging, and that was only averted by spending a lot of time and money fixing a lot of software.
Almost nothing was spent outside the Anglosphere, and very little by many schools and small businesses. They had no problems. I called this out all through 1999, and I was right.
Y2K was likely overblown. But it was fully credible in the banking sector, which relies (overmuch, IMO) on legacy software, sometimes still dating back to the 1960s and 1970s. The PC revolution occurred in the 1980s, with all apps newly written for the new machine types. Memory was (comparatively) cheap by then, so storing a full four-digit year wasn't much of a problem. I can see why schools and small businesses were unaffected by Y2K. Banks had reason to fear.
I personally worked on fixing a few systems that would otherwise have failed _very_ expensively. Don't forget there wasn't the same penetration of computers in small businesses as there is now.
Lots of big company software was worked on. I was involved in two different Y2K software projects.
The PC accounting package I used had two Y2K releases, because the first one still had Y2K bugs.
The company I now work at has systems with very visible pre-Y2K date storage in 5 digits (two-digit year plus day of year) alongside the fixed 8-digit storage for dates with a 4-digit year - the signs of a Y2K upgrade.
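For concreteness, a minimal Python sketch of what those two encodings look like for the same date (the sample date is my own, purely illustrative):

```python
from datetime import date

d = date(1999, 3, 14)  # arbitrary illustrative date

# Pre-Y2K 5-digit form: 2-digit year + 3-digit day of year -> "99073"
five_digit = f"{d.year % 100:02d}{d.timetuple().tm_yday:03d}"

# Post-upgrade fixed 8-digit form with a full 4-digit year -> "19990314"
eight_digit = d.strftime("%Y%m%d")

print(five_digit, eight_digit)
```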
In 1990 I was required to build a system that did not support dates in the 2000s (a bad contract specification by Australia Post).
It was easy to get it wrong and have 29 Feb 2000 not work: accepting and validating a date as being 19xx and only later correcting the century didn't work, because there was no 29 Feb 1900 (1900 was not a leap year, but 2000 was).
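A minimal Python sketch of that buggy pattern, assuming dd/mm/yy input and a century-windowing cutoff I've picked arbitrarily (the function name is mine, not any real system's):

```python
from datetime import date

def parse_ddmmyy(text):
    """Buggy Y2K-era pattern: validate the date as 19xx first,
    then apply the century correction afterwards."""
    day, month, yy = int(text[0:2]), int(text[2:4]), int(text[4:6])
    date(1900 + yy, month, day)           # validation step assumes 19xx...
    century = 2000 if yy < 50 else 1900   # ...century corrected too late
    return date(century + yy, month, day)

parse_ddmmyy("010699")   # fine: 1 Jun 1999
parse_ddmmyy("290200")   # ValueError - there was no 29 Feb 1900,
                         # so the perfectly valid 29 Feb 2000 is rejected
```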
On the other hand Australian management got excited about checking that elevators, cars and refrigerators would still work.
If your software was recent or off the shelf, you were mostly OK - the developers or vendors dealt with it.
Incidentally, I saw the Unix 2038 problem (signed 32-bit time values overflow and go negative) at work in 2003, because of a 35-year contract.
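A rough Python illustration, with an invented contract end date (any date past 19 January 2038 would do): the epoch-seconds value for a date 35 years on from 2003 no longer fits in a signed 32-bit field, so it wraps negative and looks like a pre-1970 date.

```python
import struct
from datetime import datetime, timezone

# A 35-year contract signed in 2003 runs past the signed 32-bit time_t
# limit of 2038-01-19 03:14:07 UTC.
contract_end = datetime(2038, 6, 30, tzinfo=timezone.utc)  # invented date
seconds = int(contract_end.timestamp())       # 2161468800, > 2**31 - 1

# Storing that in a signed 32-bit field is effectively this round-trip:
wrapped, = struct.unpack("<i", struct.pack("<I", seconds & 0xFFFFFFFF))
print(seconds, wrapped)   # 2161468800 -2133498496 (apparently before 1970)
```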
ARM, the more-or-less benevolent hegemon that designed the 280 billion processors used in most of the world's IT hardware and smart devices, is publicly upbeat about continued large efficiency gains. A press release from April cites gains of 25%-50% in recently announced standard cloud processors, and a staggering 25X improvement in one particular AI optimisation package from Nvidia. The examples are cherry-picked of course, but the release is signed by the CEO, and ARM didn't get where it is today by lying to its technically savvy customers about its products or guessing wrong on industry trends. https://newsroom.arm.com/blog/driving-ai-datacenter-compute-efficiency
I am with Dave Irving, above, on the Y2K issue. I managed the technical efforts of both finance and public broadcasting organisations for Y2K compliance. The work was necessary for the finance industry because of the vast amount of Cobol code in banking and the rather obvious maths issue. So I must protest that it was not a rort - but OK, it certainly was not the “planes falling from the sky” disaster everyone feared.
The benefits for broadcasting were less well recognised, and I can concede it was a bit of a “try-on”, but a much-needed one. Back then, the budgets they gave us to combat the Y2K bug were significant and allowed us techs to build some decent infrastructure into broadcasting (and into finance technology at the Stock Exchange). It was needed in broadcasting not so much for the maths issue as to future-proof the technology against the cuts we knew would (and did) follow once that funding dissipated. We justified a lot of rebuilds with that funding, and it was fortuitous. Public broadcasting was right to expect the axe to come crashing down afterwards, but I just don't think we ever realised how deep the cuts would go. Y2K was a justifiable expense in public broadcasting because successive governments would strip funding and make it impossible to keep the tech up to date otherwise.
I've just done an FB post complaining about the Green puritanism that has led to a scandal over lack of proper air-conditioning in the newly-built Paris Olympic village. This has led to the teams of rich countries paying for expensive optional a/c units, while those of poor countries sweat it out in a heatwave. The athletes are all young and supremely fit, so probably none of them will die from the heat - unlike thousands of old French people every summer, and those who will rent or buy the Olympic village flats.
Green puritans have not registered the facts that renewable electricity from wind, sun and storage is not just cheap, it is unlimited for practical purposes, and that heat pump air conditioners are efficient and affordable. If AI vendors want to build huge data centres for more of it, there is no ecological reason to stop them, provided they source their electricity supply sustainably, as leading companies like Amazon, Google and Microsoft have promised to do.
In theory the same would hold for honest crypto, but since the sleazy industry in reality does not source sustainably and its product is socially worthless, there is a strong case for a heavy Pigovian tax on the bubble before it pops. Meanwhile, and for the foreseeable future, it wastes far more electricity than AI. https://www.statista.com/statistics/1462943/global-electricity-demand-from-data-centers-and-crypto-forecast
I think it is only perceived to be up to green politics because mainstream politics failed, and still largely fails, to step up - and keeping the focus on extremists and their unreasonable demands keeps attention away from that failure of reasoned effort, especially by those holding the highest offices with duties of care.
Achieving an abundance of low-emissions energy, which would make even extravagant wastefulness low-emissions, seems much more likely to win wide support than calling for people to go without. Neither is likely to be sufficient, but better to fail aiming for zero-emissions abundance than aiming for enforced frugality. I don't think calling for frugality is wrong, just ineffective - a tactical miscalculation. And when environmentalists had the issue handed to them hot-potato style - "you care, you fix it" - I think the intent was not to raise green politics up but to let it be judged by the failures of its "alternative energy" options. Instead the Greens gained credibility and became less fringe. One of the better miscalculations, IMO.
I really do not understand this logic at all from Trump et al. Let us imagine that by 2034, AI is using half of America's energy. I very much hope this is not the case, because the only situation in which I can imagine such a thing happening is a concerted effort to trigger a singularity - but let's say it happens. AI training does not need to happen during the night, and wind power is perfectly suitable. New-build wind and solar are substantially cheaper than coal and gas, and getting more so! If we really do need cosmic amounts of power, that is an argument for more investment in renewables!