Important warning about the latest Windows Update - do not install!

I had a look at my Windows Update history and that particular update, KB5063878, is not listed. I did very recently do a mass data transfer to two USB SSD drives without any problems. So either I've dodged a bullet or my system doesn't have the hardware that requires the update.
 
My HP laptop, which I'm using while away down south in soggy Sydney, has this update. I was installing all the patches I could find after my shambolic 24H2 install on both my PC tower and the laptop.
So far it hasn't caused any issues, but I haven't been moving large files around either. The Hynix drive isn't on any suspect list I've seen.
I was toying with the idea of downloading the latest TRS22+ to while away another wet day but maybe I'll read a book instead. :)
 
That update has been on my Dell for 3+ days. No problems with my SSD, but I haven't done any 50 GB+ transfers.

Removed now. Paused updates.

After reading about the potential problem, I was preparing to do a huge transfer of files ... perhaps creating the situation that triggers the problem!

Thank you John for sharing the warning.
 


Luckily, I didn’t need to do any of that registry stuff.

Some time after uninstalling the problem update, the system notified me that I needed to look at something in my update settings (presumably it detected that I didn’t have the latest update installed). Only then did the Update control panel include a Pause Updates button. It had options for 1, 2, 3, 4, or 5 weeks. I opted for 4 weeks, hoping that it will be sufficient time for Microsoft to come out with a better update.

It’s annoying and confusing that this Pause button is not displayed all the time, but at least it does appear eventually.
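For anyone who does end up doing the registry stuff: as far as I can tell, the pause settings live under HKLM\SOFTWARE\Microsoft\WindowsUpdate\UX\Settings as ISO-8601 timestamp strings. A rough Python sketch that just computes those values (the value names are my assumption; check them in regedit on your own build before writing anything):

```python
from datetime import datetime, timedelta, timezone

# Value names below are an assumption based on what's been reported under
# HKLM\SOFTWARE\Microsoft\WindowsUpdate\UX\Settings -- verify on your build.
PAUSE_FMT = "%Y-%m-%dT%H:%M:%SZ"

def pause_values(weeks: int) -> dict:
    """Compute the ISO-8601 start/expiry strings for pausing updates."""
    start = datetime.now(timezone.utc).replace(microsecond=0)
    expiry = start + timedelta(weeks=weeks)
    return {
        "PauseUpdatesStartTime": start.strftime(PAUSE_FMT),
        "PauseUpdatesExpiryTime": expiry.strftime(PAUSE_FMT),
    }
```

You would still have to write those strings into the registry yourself, so waiting for the Pause button to show up is definitely the easier path.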
 
Am I correct that you lot are talking about a Windows 11 release that came out about 12 months ago???

A bit like shutting the barn door after the horse has bolted!
 
Latest news on this issue: Phison was not able to reproduce the claimed SSD failures.

What was discovered seems to make sense: overheating SSDs that were not set up with proper ventilation in a case, or had no cooling heatsink. A big file transfer will tend to make the SSD warm up. That's more of a relief to me, as I never seem to exceed 45 °C on my setup.
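If anyone wants to keep an eye on NVMe temperatures the same way, smartmontools can report them: `smartctl -a -j /dev/nvme0` emits JSON that includes a temperature block. A rough sketch of reading it from Python (the device path and the 70 °C warning threshold are my own guesses, not from any advisory):

```python
import json
import subprocess

WARN_C = 70  # assumed threshold -- near typical NVMe throttle territory

def drive_temp_c(smartctl_json: str) -> int:
    """Pull the current temperature (deg C) out of smartctl -j output."""
    data = json.loads(smartctl_json)
    return data["temperature"]["current"]

def check(device: str = "/dev/nvme0") -> int:
    # Requires smartmontools; usually needs admin/root privileges.
    out = subprocess.run(
        ["smartctl", "-a", "-j", device],
        capture_output=True, text=True, check=True,
    ).stdout
    temp = drive_temp_c(out)
    if temp >= WARN_C:
        print(f"{device}: {temp} C - consider a heatsink or better airflow")
    return temp
```

Most drive dashboard utilities (CrystalDiskInfo and the like) show the same sensor, so this is only worth scripting if you want to log it over a long transfer.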
 
Thank you for posting an update on this. I never experienced an issue with my system while running that update. If it is heat-related, it won't affect me, because like you I keep my systems as cool as possible.

I can see heat being an issue with those new NVMe external enclosures, with servers that are not kept in optimal environments, and with the many computers that are not adequately cooled for all sorts of reasons, including clogged fans and ventilation openings, or poor design. This issue is most likely also seen on systems forced to run graphics-intensive games or programs that really heat up the inside of their poorly ventilated cases.

From my IT experience, servers have been tucked into closets with zero ventilation and run 24/7/365 in that environment. This was one of the issues we solved when the company moved, by setting up a dedicated, proper computer room to house all of our servers, telco, and network equipment, with its own HVAC and power separate from the rest of the office, where the A/C was turned off when no one was in the building.

From my experience as a tech, devices such as SSDs tend to be more heat-sensitive than other components; in my day I dealt with their great-grandparents, EPROMs and EEPROMs. Back then we could reproduce failures by heating the components with a heat gun, like those used to strip paint, then following up with a cool-down using freeze spray (Freon). This was before the Freon restrictions of the early-to-mid 1980s.
 
Sorry for the off-topic, but when I was IS Coordinator for the Idaho Region Timberlands, the servers were in my office (fortunately a large room), with an old window air conditioner. They agreed that was not adequate, so they installed a dedicated outside unit with vents in the wall and a remote control. The only problem was that the town we were in did not have clean power, and any time the power blipped, the new AC unit went off and would not come back on by itself. So of course every time I left my office I had to turn on the old window air conditioner, just in case... :rolleyes:
 
That's not unusual, and it's very much like the second location the company moved to. There, the computer room was a former bathroom. A large window air conditioner was placed in the window, and that was used to cool the servers. Like you, I had a remote control and had to watch the city power, because it had major brownouts and the A/C unit would turn off.

The company by this point was closing; they lasted another 18 months, and we dealt with that until the end. It was sad going from a dedicated room to a former bathroom, but I made do the best I could.
 
I took a job offer at the company headquarters in Boise starting January 2nd, and on the 12th of January they announced the Idaho Region Timberlands was closing. People thought I had inside information, but it was a career move, and I could see the handwriting on the wall as well. Turned out to be perfect timing!
 
This is getting more interesting. Maybe finger pointing should be to motherboard or case manufacturers. They have part of the say in how SSDs get placed and cooled. (I am being sarcastic, by the way)
No reason to be sarcastic here; it's a perfectly good reason to point at hardware and case manufacturers. When a manufacturer such as HP designs laptops, they use the same case across designs regardless of whether the machine is a low-end i3 with 8 GB of RAM and a 256 GB NVMe drive, or a high-end setup with an i9-14900K, an RTX 4090, 64 GB of RAM, and dual 4 TB NVMe drives.

The case design is fine for the i3 machine but definitely not adequate for the i9 configuration. I used HP here, but the same applies to Dell, Lenovo, MSI, ASUS, Acer, and many other manufacturers.

The same can be said about desktops. Many have fancy glass fronts to show off the spinny fans and bright glowing LEDs, but do little to cool the components inside adequately, even with liquid cooling for the CPU. The AIO has no way of shedding heat from the liquid because its radiator is sitting in the hot air inside the case. Warm or hot ambient air, combined with the heat inside the case, will eventually degrade the components substantially, leading to unstable operation, because there's no way to remove the heat from inside the case.

SSDs and other silicon are very sensitive to heat. They may operate fine initially, but as time goes on the heat takes its toll on the chips as the junctions begin to fail and bond wires begin to disconnect from the leads. These kinds of failures can be seen with video cards that work fine while they're cool but exhibit random green and magenta blocks when put under stress.
 
The HP thing does make sense; quite a few of the models look very similar or even identical to each other.
 