• 0 Posts
  • 148 Comments
Joined 1 year ago
Cake day: April 3rd, 2024


  • To quote that same document:

    Figure 5 looks at the average temperatures for different age groups. The distributions are in sync with Figure 4 showing a mostly flat failure rate at mid-range temperatures and a modest increase at the low end of the temperature distribution. What stands out are the 3 and 4-year old drives, where the trend for higher failures with higher temperature is much more constant and also more pronounced.

    That’s what I referred to. I don’t see a total age distribution for their HDDs, so I have no idea whether they simply didn’t have many HDDs in the three-to-four-year range, which would explain why they didn’t see a correlation in the total population. However, they do show a correlation between high temperatures and AFR (annualized failure rate) for drives with more than three years of usage.

    My best guess is that HDDs wear out slightly faster at temperatures above 35-40 °C, so if your HDD is going to die of an age-related problem, it’s going to die a bit sooner if it runs hot. (Also note that we’re talking about average temperature, so the peak temperatures might have been much higher.)

    In a home server where the HDDs spend most of their time idling (probably even below Google’s “low” usage bracket), you probably won’t see a difference within the expected lifespan of the HDD. Still, the correlation does exist, and it might be prudent to add some HDD cooling if temps exceed 40 °C regularly.
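    If you want to keep an eye on that 40 °C threshold yourself, here’s a rough sketch that reads drive temperatures via smartctl’s JSON output. The device paths, the threshold constant, and the assumption that your smartmontools build is recent enough to support -j are all setup-specific guesses, so treat it as a starting point rather than a drop-in tool.

    ```python
    # Rough sketch: print HDD temperatures via smartctl's JSON output and flag
    # anything at or above 40 °C. Assumes smartmontools 7+ (for the -j flag),
    # that these example device paths exist, and that it runs with root privileges.
    import json
    import subprocess

    DRIVES = ["/dev/sda", "/dev/sdb"]  # example paths, adjust to your system
    WARN_AT = 40  # °C, the threshold discussed above

    for dev in DRIVES:
        result = subprocess.run(
            ["smartctl", "-j", "-A", dev],
            capture_output=True, text=True, check=False,
        )
        try:
            temp = json.loads(result.stdout).get("temperature", {}).get("current")
        except json.JSONDecodeError:
            temp = None

        if temp is None:
            print(f"{dev}: no temperature reported")
        elif temp >= WARN_AT:
            print(f"{dev}: {temp} °C (consider more airflow)")
        else:
            print(f"{dev}: {temp} °C")
    ```

    Cron or a systemd timer can run something like this periodically; anything fancier (alerts, graphs) is better left to proper monitoring tools.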


  • Hard drives don’t really like high temperatures for extended periods of time. Google did some research on this way back when. Failure rates start going up at an average temperature of 35 °C and become significantly higher if the HDD is operated beyond 40 °C for much of its life. That’s HDD temperature, not ambient.

    The same applies to low temperatures. The ideal temperature range seems to be between 20 °C and 35 °C.

    Mind you, we’re talking “going from a 5% AFR to a 15% AFR for drives that saw constant heavy use in a datacenter for three years”. Your regular home server with a modest I/O load is probably going to see much less in terms of HDD wear. Still, heat amplifies that wear.

    I’m not too concerned myself despite the fact that my server’s HDD temps are all somewhere between 41 and 44 °C. At 30 °C ambient there’s not much better I can do, and the HDDs spend most of their time idling anyway.







  • “Legally required”, so the requirement presumably comes from their local laws. Some countries require websites to disclose who operates them.

    For example, in Germany, websites are subject to the DDG (Digitale-Dienste-Gesetz, “digital services law”). Under this law they are subject to the same disclosure requirements as print media. At a minimum, this includes the operator’s full name, address, and email address. Websites operated by companies or for certain purposes can need a lot more information in there.

    Your website must have a complete imprint that can easily and obviously be reached from any part of the website and is explicitly called “imprint”.

    These rules are meaningless to someone hosting a website in Kenya, Australia, or Canada. But if you run a website in Germany you’d better familiarize yourself with them.


  • I work for a publicly traded company.

    We couldn’t switch away from Microsoft if we wanted to because integrating everything with Azure and O365 is the cheapest solution in the short term, ergo has the best quarterly ROI.

    I don’t think the shareholders give a rat’s ass about data sovereignty if it means a lower profit forecast. It’d take legislative action for us to move away from an all-Azure stack.

    And yes, that sucks big time. If Microsoft stops playing nice with the EU, we’re going to have to pivot most of our tech stack at a moment’s notice.




  • Leopard and Snow Leopard had vastly better virtual desktops than Lion onward. You actually had a grid of them and could navigate up/down/left/right with shortcuts; afterwards you only got a linear list of desktops.

    Gridded desktops were great. I had a 3x3 grid, of which five cells were used: my main desktop in the center, Thunderbird to the right, my IRC and IM clients to the left, and iTunes below. I don’t remember what was above; it’s been a while.


  • My most used features so far are vertical splitters, vertical nudging, and the new placement modes for conveyors and pipes. With an honorable mention going to conveyor wall holes, which also free up a lot of design options.

    Honestly, though, just about everything in this update has been a godsend. Priority splitters are the only thing I haven’t really used yet. Even the elevators rock; being able to zoop up to 200 meters up or down in one go can make them useful even as a temporary yardstick for tall structures. (Also, I did end up needing to go 150 meters straight down to get at some resources and can confirm that elevators handle their intended purpose very well.)


  • Do you want a prediction? The current cost of graphics cards will crash the classic PC gaming market. There are some enthusiasts buying cards for thousands of dollars or building €4,000 computers, but the majority of gamers will stay on their laptops or go for cheaper devices like the Steam Deck. If your game needs more power, a modern graphics card, and a beefier PC, there are fewer and fewer people who can run it and many who can’t afford it. So devs will target lower system specs to reach the bigger audience.

    Also, there’s not as much value in high-powered GPUs right now because these days high-end graphics often mean Unreal Engine 5. UE5 is excellent for static and slow-moving scenes but has a tendency towards visible artifacts when the picture, and especially the camera position, changes quickly (not least because it relies heavily on TAA). These artifacts are largely independent of how good your GPU is.

    Unlike in previous generations, going for high-end graphics doesn’t necessarily mean you get a great visual experience – your games might look like smeary messes no matter what kind of GPU you use because that’s how modern engines work. Smeary messes with beautiful lighting, sure, but smeary messes nonetheless.

    My last GPU upgrade was from a Vega 56 to a 4080 (and then an XTX when the 4080 turned out to be a diva) and while the newer cards are nice I wouldn’t exactly call them 1000 bucks nice given that most modern games look pretty bad in motion and most older ones did 4K@60 on the Vega already. Given that I jumped three generations forward from a mid-tier product to a fairly high-end one, the actual benefit in terms of gaming was very modest.

    The fact that Nvidia are now selling fancy upscaling and frame interpolation as killer features also doesn’t inspire confidence. Picture quality in motion is already compromised; I don’t want to pay big money to compromise it even further.

    If someone asked me what GPU to get, I’d tell them to get whatever they can find for a couple hundred bucks because, quite frankly, the performance difference isn’t worth the price difference. RT is cool for a couple of days, but I wouldn’t spend much on it either, not as long as the combination of TAA and upscaling hides half of the details behind dithered motion trails and time-delayed shadows.






  • AI isn’t “taking off” now because it already took off in the 60s. Heck, they were even working on neural nets back then. Same as in the 90s, when they actually got them to be useful in a production environment.

    We got a deep learning craze in the 2010s and then bolted that onto neural nets to get the current wave of “transformers/diffusion models will solve all problems”. They’re really just today’s LISP machines; expected to take over everything but unlikely to actually succeed.

    Notably, deep learning assumes that better results come from bigger datasets, but we’ve already trained our existing models on the sum total of humanity’s writings. In fact, current training is hampered by the fact that a substantial share of all new content is already AI-generated.

    Despite how much the current approach is hyped by the tech companies, I can’t see it delivering further substantial improvements by just throwing more data (which doesn’t exist) or processing power at the problem.

    We need a systemically different approach, and while it seems like there’s all the money in the world to fund the necessary research, the same seemed true in the 50s, the 60s, the 80s, the 90s, the 10s… In the end, a new AI winter will come as people realize that the current approach won’t live up to their unrealistic expectations. Ten to fifteen years later, some new approach will come out of underfunded basic research.

    And it’s all just a little bit of history repeating.