• 0 Posts
  • 46 Comments
Joined 8 months ago
Cake day: April 3rd, 2024

  • This is a case of you having some very specific requirements that can only be met in a certain way, that being Windows in this case. Whether or not a switch makes sense depends on how important those requirements are to you. Seems perfectly reasonable to me.

    I personally found overriding a game’s rendering settings worth it in only a few cases, but that’s me. If you use it a lot, then you use it a lot.

    As for AI upscaling, my main issue there is that Nvidia chose a name so generic that it’s hard to google. And then they made a second unrelated feature with a very similar name.

    There is AI video upscaling for Linux but it probably doesn’t work quite the same way Nvidia’s offering does. That might be a problem or it might not; I admittedly only invested a minute to look it up so I don’t have any details.
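
    From that quick look, the thing that comes up most often is mpv with an upscaling shader. A minimal sketch, assuming mpv is installed and you’ve downloaded a shader like FSRCNNX yourself (the shader file name and path here are placeholders, not something I’ve verified):

      # Play a video through a prescaler shader; the shader path is a placeholder.
      mpv --glsl-shaders=~/.config/mpv/shaders/FSRCNNX_x2_8-0-4-1.glsl video.mkv

    Whether that counts as a real replacement is another question, since it only helps in players that support such shaders.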

    The same applies to SDR-to-HDR. There seems to be something but it probably doesn’t work like what you currently use.

    So in the end you’ll have to decide whether you’d be more annoyed by not having those features or by having to use whatever zany shit Microsoft come up with. Not a great decision but that’s life.

    I personally might have stuck with Windows longer on my desktop if my 4080 hadn’t turned out to be wonky and Nvidia’s driver hadn’t turned out to be so capricious that I had to spend two months ruling out plausible error causes. That drove me back to AMD, which made the switch easy. But again, that’s me and not you.


  • Ah, the old Nvidia problem. It’s true that Nvidia’s Linux driver isn’t very good (although I don’t think their Windows driver is very good either, it just has more features).

    The 3D Settings page is specific to the Nvidia Windows driver. Even an AMD user might’ve been slightly confused (although AMD ships comparable features, just located elsewhere under a different name). This is indeed something the Linux drivers plain don’t have in that form, although I can’t remember the last time I felt a need to really muck around in there.

    Admittedly, overriding game rendering behavior might not even always be possible, seeing that DirectX games are run through a translation layer before the GPU gets to do anything.

    I wasn’t able to find solid info on AI upscaling even on Windows, mainly because of that feature’s terrible name and because Nvidia offers both “AI Upscaling” and “Nvidia Image Scaling”; I have no idea whether those are the same thing. The former seems to be specific to the Nvidia SHIELD.

    Unless you’re talking about DLSS, which is supported.

    The HDR one is odd but might again be related to the Nvidia driver not being very good. This should improve in the future but they are admittedly trailing behind.


  • That’s less of an issue these days. In the 2000s it was like that, especially since people used all sorts of add-in cards. These days a lot of those cards have merged with the mainboard (networking, sound, USB) or have fallen out of fashion (e.g. TV tuners).

    The mainboard stuff is generally well-supported. The days of the Winmodem are over. The big issues these days are special-purpose hardware (which generally doesn’t work with later Windows versions either), laptops, and Nvidia GPUs (which are getting better).


  • Tossing Gentoo onto an old Pentium III box, typing emerge world and coming back four hours later to see if it’s done was awesome.

    And no, it wasn’t done compiling KDE yet.

    But I definitely wouldn’t want to experiment with Linux on my only PC with no way to look things up if I break networking (or the whole system). Thankfully, this is no longer an issue in the age of smartphones.




  • Flatpak has its benefits, but there are tradeoffs as well. I think it makes a lot of sense for proprietary software.

    For everything else I prefer native packages since they have fewer interop issues. Space efficiency isn’t even that important to me; if space issues do arise, they’re relatively easy to work around. But if your password manager can’t talk to your browser because the security model has no solution for safe arbitrary IPC, you’re SOL.
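
    To make that concrete: the usual workaround is to punch holes into the sandbox by hand. This is only a sketch; the app ID and path are examples, and real password-manager integration typically needs more plumbing (a native-messaging proxy or portal support):

      # Illustrative only: give a Flatpak browser read access to the host’s
      # native-messaging manifest directory. App ID and path are examples.
      flatpak override --user --filesystem=~/.mozilla/native-messaging-hosts:ro org.mozilla.firefox

    And even then it may not work, which is exactly the kind of friction I mean.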





  • Jesus_666@lemmy.world to Linux@lemmy.ml · *Permanently Deleted* · 1 month ago

    Oh yeah, the equation completely changes for the cloud. I’m only familiar with local usage where you can’t easily scale out of your resource constraints (and into budgetary ones). It’s certainly easier to pivot to a different vendor/ecosystem locally.

    By the way, AMD does have one additional edge locally: They tend to put more RAM into consumer GPUs at a comparable price point – for example, the 7900 XTX competes with the 4080 on price but has as much memory as a 4090. In systems with one or few GPUs (like a hobbyist mixed-use machine) those few extra gigabytes can make a real difference. Of course this leads to a trade-off between Nvidia’s superior speed and AMD’s superior capacity.


  • Jesus_666@lemmy.world to Linux@lemmy.ml · *Permanently Deleted* · 1 month ago

    These days ROCm support is more common than a few years ago so you’re no longer entirely dependent on CUDA for machine learning. (Although I wish fewer tools required non-CUDA users to manually install Torch in their venv because the auto-installer assumes CUDA. At least take a parameter or something if you don’t want to implement autodetection.)
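
    For reference, the manual step usually looks roughly like this. It’s a sketch; the ROCm version in the index URL changes over time, so check pytorch.org for the current one instead of copying it verbatim:

      # Inside the tool’s venv: swap the CUDA-flavored Torch for a ROCm build.
      source venv/bin/activate
      pip uninstall -y torch torchvision torchaudio
      pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/rocm6.1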

    Nvidia’s Linux drivers generally are a bit behind AMD’s; e.g. driver versions before 555 tended not to play well with Wayland.

    Also, Nvidia’s drivers tend not to give any meaningful information in case of a problem. There’s typically just an error code for “the driver has crashed”, no matter what reason it crashed for.

    Personal anecdote for the last one: I had a wonky 4080 and tracing the problem to the card took months because the log (both on Linux and Windows) didn’t contain error information beyond “something bad happened” and the behavior had dozens of possible causes, ranging from “the 4080 is unstable if you use XMP on some mainboards” through “some BIOS setting might need to be changed” and “sometimes the card doesn’t like a specific CPU/PSU/RAM/mainboard” to “it’s a manufacturing defect”.

    Sure, manufacturing defects can happen to anyone; I can’t fault Nvidia for that. But the combination of useless logs and 4000-series cards having so many things they can possibly (but rarely) get hung up on made error diagnosis incredibly painful. I finally just bought a 7900 XTX instead. It’s slower but I like the driver better.




  • Anduril is way overengineered. I like this UI that some of my lights have:

    While off:

    • One push: Turn on at the last used brightness.
    • Two pushes: Turn on at maximum brightness.
    • Three pushes: That strobe mode that you don’t need but seems to be obligatory.
    • Hold: Turn on at the lowest brightness (or moonlight mode if the light has one).

    While on:

    • One push: Turn off.
    • Two pushes: Toggle between maximum brightness and the last used “regular” brightness.
    • Three pushes: That strobe mode that someone has to have some use for.
    • Hold: Alternately increase or decrease the brightness.

    That’s pretty easy to learn and gives you all the functions you’d reasonably need (plus that strobe) without a lot of clutter.


  • They did PR campaigns against Linux and OpenOffice for quite some time – until cloud computing took off and it turned out they could earn more money by supporting Linux than by fighting it.

    In fact, Microsoft weren’t happy about FOSS in general. I can still remember when they tried to make “shared source” a thing: They made their own ersatz OSI with its own set of licenses, some of which didn’t grant proper reuse rights – like only allowing you to use the source code to write Windows applications.


  • True, although that has happened with F/OSS as well (like with xz or the couple of times people put Bitcoin miners into npm packages). In either case it’s a lot less likely than the software simply ceasing to be supported, becoming gradually incompatible with newer systems, and rotting away.

    Except, of course, that I can pick up the decade-old corpse of an open source project and try to make it work on modern systems, despite how painful it is to try to get a JavaFX application written for Java 7 and an ancient version of Gradle to even compile with a recent JDK. (And then finally give up and just run the last Windows release with its bundled JRE in Wine. But in theory I could’ve made it work!)


  • Note that this specifically talks about proprietary platforms. Locally-run proprietary freeware has entirely different potential issues, mostly centered on the developer ceasing to maintain it. Locally-run F/OSS has similar issues, actually, but they’re lessened by the fact that someone might later pick up the project and continue it.

    Admittedly, platforms are very common these days because the web is an easily accessible cross-platform GUI toolkit and SaaS is more easily monetized.