Full uBlock is a mixed bag on mobile because it eats battery/performance, and (if you add all the same filter sources) integrated blockers like Orion’s are just about the same anyway.
Oh heck yeah. I’ve been using it on iOS a ton, and dying for this on Windows/Linux.
Fun trivia: what browser supports HEIF, JPEG XL, AVIF, and AV1, all with correctly rendered HDR?
Not Chrome. And not Firefox, nor anything based on them I’ve tried: https://caniuse.com/?search=image+format
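For anyone who wants to poke at their own browser, here’s a minimal probe sketch (TypeScript, browser-only). It only tests decoding, not whether HDR actually renders correctly, and `SAMPLES` is a hypothetical map you’d fill with real base64-encoded 1x1 test images per format:

```typescript
// Ask the browser to decode a tiny sample image and report success.
// HTMLImageElement.decode() rejects when the data can't be decoded,
// which includes unsupported formats.
async function canDecode(dataUrl: string): Promise<boolean> {
  const img = new Image();
  img.src = dataUrl;
  try {
    await img.decode(); // rejects on unsupported/undecodable data
    return img.width > 0;
  } catch {
    return false;
  }
}

// Hypothetical usage -- the base64 payloads are omitted here:
// const SAMPLES: Record<string, string> = {
//   "image/avif": "data:image/avif;base64,…",
//   "image/jxl":  "data:image/jxl;base64,…",
// };
// console.log("AVIF:", await canDecode(SAMPLES["image/avif"]));
```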
brucethemoose@lemmy.world to Technology@lemmy.world • Windows 11 just lost 5% market share in two months despite Windows 10 losing support. (English)
51 · 11 days ago
I like Windows 11. But only as a thoroughly neutered, disposable “secondary” OS to dual boot with Linux, to the extent that I could wipe my Windows partition without a care.
If I had to use Windows 11 as my only OS, I’d pull my hair out. Same with desktop Linux TBH. There’s stuff that’s just painful in both ecosystems.
brucethemoose@lemmy.world to Technology@lemmy.world • Windows 11 just lost 5% market share in two months despite Windows 10 losing support. (English)
22 · 11 days ago
Apple’s media support is incredible.
I have one platform where HDR photo/video playback and editing, JPEG XL, HEIFs from my camera, and such all just work. And it’s definitely not my KDE desktop, nor Windows 11.
brucethemoose@lemmy.world to Lemmy Shitpost@lemmy.world • We need to get to the bottom of this
51 · 11 days ago
Puuuuurge

brucethemoose@lemmy.world to Technology@lemmy.world • The upgrade argument for desktops doesn't stand up anymore (English)
1 · 11 days ago
Yeah, probably. I actually have no idea what they charge, so I’d have to ask.
It’d be worth it for a 3090 though, no question.
brucethemoose@lemmy.world to Technology@lemmy.world • The upgrade argument for desktops doesn't stand up anymore (English)
2 · 11 days ago
This doesn’t make any sense, especially the 2x 3090 example. I’ve run my 3090 at PCIe 3.0 over a riser, and there’s only one niche app where it ever made any difference. I’ve seen plenty of benches showing PCIe 4.0 is just fine for a 5090:
https://gamersnexus.net/gpus/nvidia-rtx-5090-pcie-50-vs-40-vs-30-x16-scaling-benchmarks
1x 5090 uses the same net bandwidth, and half the PCIe lanes, as 2x 3090.
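Rough back-of-the-envelope, using the usual post-encoding per-lane throughput figures (a sketch of my math, not anything from the benchmarks above):

```typescript
// Usable PCIe throughput per lane, GB/s, after link-encoding overhead.
const GBPS_PER_LANE = { gen3: 0.985, gen4: 1.969, gen5: 3.938 };

// A 3090 is a PCIe 4.0 x16 card; a 5090 is PCIe 5.0 x16.
const one3090 = 16 * GBPS_PER_LANE.gen4;  // ~31.5 GB/s
const dual3090 = 2 * one3090;             // ~63 GB/s, over 32 lanes total
const one5090 = 16 * GBPS_PER_LANE.gen5;  // ~63 GB/s, over 16 lanes

console.log(dual3090.toFixed(1), one5090.toFixed(1)); // "63.0 63.0"
```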
Storage is, to my knowledge, always on a separate bus from graphics, so that also doesn’t make any sense.
My literally ancient TX750 still worked fine with my 3090, though it’s since been moved to another build. I’m just going to throttle any GPU that uses more than 420W anyway, as that’s ridiculous and past the point of diminishing returns.
And if you are buying a 5090… a newer CPU platform is like a drop in the bucket.
I hate to be critical, and there are potential issues, like severe CPU bottlenecking or even instruction support. But… I don’t really follow where you’re going with the other stuff.
brucethemoose@lemmy.world to Technology@lemmy.world • The upgrade argument for desktops doesn't stand up anymore (English)
50 · 12 days ago
That’s a huge generalization, and it depends on what you use your system for. Some people might be on old Threadripper workstations that work fine, for instance, and can just slap in a second GPU. Or maybe someone needs more cores for work; they can just swap their CPU out. Maybe your 4K gaming system can make do with an older CPU.
I upgraded RAM and storage just before the RAMpocalypse, and that’s not possible on many laptops. And I can stuff a whole bunch of SSDs into the case and use them all at once.
I’d also argue that ATX desktops are more protected from anti-consumer behavior, like soldered price-gouged SSDs, planned obsolescence, or a long list of things you see Apple do.
…That being said, there are a lot of trends going against people, especially for gaming:

- There’s “initial build FOMO,” where buyers max out their platform at the start, even if that’s financially unwise and they miss out on sales/deals.
- We just went from DDR4 to DDR5, on top of some questionable segmentation from AMD/Intel. So yeah, sockets aren’t the longest-lived.
- Time gaps between generations are growing as silicon gets more expensive to design.
- …Buyers are collectively stupid and bandwagon. See: the crazy low-end Nvidia GPU sales when they have every reason to buy AMD/Intel/used Nvidia instead. So they’re rewarding bad behavior from companies.

On the other hand, some things still favor desktops:

- Individual parts are more repairable. If my 3090 or mobo dies, for instance, I can send it to a repairperson and have a good chance of saving it.
- You can still keep your PSU, case, CPU cooler, storage, and such. It’s a drop in the bucket cost-wise, but it’s not nothing.

IMO things would be a lot better if GPUs were socketable, with LPCAMM on the motherboard.
brucethemoose@lemmy.world to Technology@lemmy.world • 'Reverse Solar Panel' Generates Electricity at Night (English)
1 · 12 days ago
Yeah, that’d be great. Peltiers would be awesome and everywhere if they were dirt cheap.
brucethemoose@lemmy.world to Technology@lemmy.world • Notepad++ Hijacked by State-Sponsored Hackers (English)
15 · 12 days ago
So what malware got shipped?
brucethemoose@lemmy.world to Technology@lemmy.world • The TV industry finally concedes that the future may not be in 8K (English)
1 · 12 days ago
Awesome, thanks for the info and source.
Yeah, most of my frustration came from JXL/AVIF/HEIF and how Linux/Windows browsers, KDE, and Windows 11 don’t seem to support them well. Not a fan of packing HDR into 8 bits with WebP/JPG, especially with their artifacts, though I haven’t messed with PNG yet.
brucethemoose@lemmy.world to Technology@lemmy.world • The TV industry finally concedes that the future may not be in 8K (English)
9 · 13 days ago
Also, we haven’t even got HDR figured out.
I’m still struggling to export some of my older RAWs to HDR. Heck, Lemmy doesn’t support JPEG XL, AVIF, TIFF, HEIF, nothing, so I couldn’t even post them here anyway. And even then, they’d probably only render right in Safari.
brucethemoose@lemmy.world to Technology@lemmy.world • The TV industry finally concedes that the future may not be in 8K (English)
11 · 13 days ago
8K is theoretically good as “spare resolution”: running variable resolution in games and scaling everything to it, displaying photos with less scaling for better sharpness, clearer text rendering, less flickering, stuff like that.
It’s not worth paying for. Mostly. But maybe some day it will be cheap enough to just “include” with little extra cost, kinda like how 4K TVs or 1440p monitors are cheap now.
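The “spare resolution” math is clean, for what it’s worth; a quick sketch:

```typescript
// 8K UHD is exactly 4x the pixels of 4K UHD, so common resolutions map
// to integer scale factors on an 8K panel (no fractional-scaling blur).
const pixels = (w: number, h: number) => w * h;

console.log(pixels(7680, 4320) / pixels(3840, 2160)); // 4 -> 8K is 4x 4K
console.log(7680 / 3840, 4320 / 2160); // 2 2 -> 4K content at a clean 2x2
console.log(7680 / 2560, 4320 / 1440); // 3 3 -> 1440p content at a clean 3x3
```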
brucethemoose@lemmy.world to Technology@lemmy.ml • Mozilla is building an AI ‘rebel alliance’ to take on industry heavyweights OpenAI, Anthropic
5 · 14 days ago
…I actually wouldn’t be against this.
But it isn’t even genuine. They’ve forked llama.cpp into a broken clone, like about 500 other corporations before them, instead of just contributing to shit that actually works and is used, and… that’s about it.
That’s about par for the AI industry.
brucethemoose@lemmy.world to Asklemmy@lemmy.ml • What's a good slur for people who are all-in on Big Tech / surveillance capitalism / data-harvesting?
22 · 14 days ago
Tech Bro.
That’s the popular term. It’s most often applied to tech billionaires, but it covers those who idolize them, too.
I think my favorite tangential application is when Sam Altman had a meeting with some TSMC executives, and they allegedly dismissed him as a “Podcasting Bro”:
https://www.nytimes.com/2024/09/25/business/openai-plan-electricity.html
brucethemoose@lemmy.world to Games@lemmy.world • Players are returning their Dispatch copies due to Switch censorship (English)
3 · 15 days ago
Who fucking cares?
Credit card companies.
And their ad buyers, maybe.
brucethemoose@lemmy.world to No Stupid Questions@lemmy.world • Is it possible to cool my body enough to not sweat while exercising?
2 · 15 days ago
Depends how much you have to pay attention.
First off, I am not a fitness expert. YMMV.
But sometimes I do variations of bodyweight exercises in front of a TV, yes.
One day, for example, might be arm day: I sit and do leg curls for biceps, do straight pushups or tricep dips, or use a pull-up bar if I have one; even just hanging is great.
Another day might be push up variation day; wide, narrow, inclined different ways, push up and “reach to the sky with one arm,” knee pushups at the end.
Yet another is leg day. Squats, jumping squats, lunges, butt kicks, heel lifts, other positions to get different muscles. Another day may be core, another day is more shoulder/back, and so on. And all this is without weights, or with at most like a dumbbell or a pull up bar, and some kind of chair or bed for certain positions.
Your eyes will drift away from the TV, and you get exhausted doing this stuff, but you can keep up with a show if you want.
brucethemoose@lemmy.world to No Stupid Questions@lemmy.world • Is it possible to cool my body enough to not sweat while exercising?
1 · 15 days ago
Yeah, I was being casual, and I’m not an expert by any means.
I bring it up because, for me, sets of specific bodyweight exercises (like legs one day, shoulders/back another, and so on) are just more time-efficient. They give enough resistance to get sore and get me exhausted, all in one session, instead of running separately. It’s easier on my knees, with no risk of shin splints and less risk of injury than heavy weights.
brucethemoose@lemmy.world to No Stupid Questions@lemmy.world • Is it possible to cool my body enough to not sweat while exercising?
2 · 15 days ago
Come on, you know what I mean. It’s an indicator that you’re exerting yourself. Your blood vessels dilate when you’re hot to try and dump the heat, just like they constrict in places when you’re cold to conserve it.

I think they meant background transcoding while using the browser.
I don’t even want to speculate on what’s going wrong there, heh. But I can definitely see that being a quirk.