Old but gold. Posting for anybody who hasn’t seen this yet.

  • De Lancre@lemmy.world · 1 year ago

    Honestly, you can downvote me for my opinion, but when we’re talking about current vendor support, and if you just wanna play damn games, nvidia just works.

    Yes, nvidia lacks support for some features, or takes its time implementing them, like EGL for wayland support, but god damn, when we talk about something as simple as playing games, nvidia is just better. You can literally stick the card in, install the blob driver and play. (On notebooks there’s a bit more hassle and a lot of stuff may not work, like sleep or automatic poweroff of the GPU for lower power consumption, but good luck finding a competitor nowadays, lol)

    I have a 7900xtx, and it’s a fucking pain in the ass. Two (three, technically) vulkan drivers. Mesa needs to be up-to-date to use something like RT (and it will still suck, because they only started working on RT support like a month ago). Undervolting doesn’t work and probably never will, according to a redditor who’s into amdgpu development. Clock control doesn’t work. Some cards can’t be limited by TDP. There’s a problem with VRR on wayland. And there’s a two-year-old bug [1] [2] that makes the memory clock stick at maximum or minimum depending on your display refresh rate: imagine having a 7900xtx and getting like 20% of its performance, because the GPU doesn’t feel like playing today. Oh, and you can’t control RGB on the card yet, but that’s a small inconvenience and should be implemented soon, since it’s a missing feature in openRGB rather than a kernel problem. Upd. The last one is a kernel problem, as pointed out to me by a user below. Oh well.
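
    The stuck memory clock at least is easy to check from userspace, since amdgpu exposes its DPM tables in sysfs. A minimal read-only sketch (the `card0` index is an assumption; yours may differ):

```python
from pathlib import Path

def read_mclk_states(card: str = "card0"):
    """Read amdgpu's memory-clock DPM table. The line ending in '*'
    marks the currently selected state; with the refresh-rate bug the
    '*' sits on the highest (or lowest) state and never moves.
    Returns None when the file doesn't exist (no amdgpu at this index)."""
    path = Path(f"/sys/class/drm/{card}/device/pp_dpm_mclk")
    if not path.exists():
        return None
    return path.read_text().splitlines()

if __name__ == "__main__":
    states = read_mclk_states()
    if states is None:
        print("no amdgpu pp_dpm_mclk at card0")
    else:
        print("\n".join(states))
```

    On an affected card you’d see the starred line pinned to one end of the table regardless of load.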

    • CalcProgrammer1@lemmy.ml · 1 year ago

      The RGB control is a kernel problem, not an OpenRGB problem (well, it might also be an OpenRGB problem if the card doesn’t work in Windows either). The amdgpu kernel driver doesn’t expose the i2c interfaces that aren’t associated with display connectors, so the i2c interface used for RGB is inaccessible, and thus we can’t control RGB on Linux. AMD’s ADL on Windows exposes it just fine.
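
      You can see this from userspace: the kernel lists every I2C adapter it registers under sysfs, and on amdgpu the bus wired to the RGB controller simply never appears. A minimal read-only sketch (standard sysfs paths; which adapters show up depends entirely on your hardware):

```python
import glob
import os

def list_i2c_adapters():
    """List the I2C adapters the kernel currently exposes via sysfs.
    On amdgpu, only the DDC buses tied to display connectors are
    registered here; the RGB controller's bus is not, which is why
    OpenRGB has nothing to talk to on Linux."""
    adapters = []
    for node in sorted(glob.glob("/sys/class/i2c-adapter/i2c-*")):
        try:
            with open(os.path.join(node, "name")) as f:
                adapters.append((os.path.basename(node), f.read().strip()))
        except OSError:
            pass  # adapter vanished or unreadable; skip it
    return adapters

if __name__ == "__main__":
    for bus, name in list_i2c_adapters():
        print(bus, "->", name)
```

      On a desktop with amdgpu you’d typically only see the per-connector DDC/AUX buses in this list.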

      That said, I can’t agree that NVIDIA just works. Their drivers are garbage to get installed and keep updated, especially when new kernels come out. Not to mention the terrible Wayland support and lack of Wayland VRR capability. I’m happy with my Arc A770 (whose RGB is controlled over USB and just works, but requires a motherboard header).

      • De Lancre@lemmy.world · 1 year ago

        The RGB control is a kernel problem not an OpenRGB problem

        Sorry, rechecked it and yes, you’re right. [link] Oh well, another one for the long list of things that don’t work as they should on the amdgpu side, I guess.

      • ProtonBadger@kbin.social · 1 year ago

        Their drivers are garbage to get installed and keep updated, especially when new kernels come out

        Sure, but that’s not the case for all Linux distributions. Whenever my distribution ships a new kernel, it takes care of the nvidia driver as part of installing the kernel, and if there’s a new nvidia driver it installs it after a few days. I never pay much attention to it beyond noticing the output from the update.

        • Vilian@lemmy.ca · 1 year ago

          Except when you’re on a newer kernel and the nvidia driver hasn’t caught up yet.

    • nous@programming.dev · 1 year ago

      Back when this statement was made, 11 years ago, nvidia was a lot worse, especially for the kernel developers. A lot has changed and improved in those 11 years.

      But people still like to hang on to the old hate and don’t see, or don’t want to see, any progress actually being made. I am fairly sure Linus even said they’re not as bad as they used to be, but I can’t find that quote amongst all the results for that one angry statement he made. People and media much prefer to hate on things rather than watch them improve.

      • MonkRome@lemmy.world · 1 year ago

        While I agree, there are other reasons to hate on them even if they improved in one place… Deceptive marketing, melting cards, poor vendor management, etc.

        • nous@programming.dev · 1 year ago

          Yeah, but their competitors are not doing much better in those regards either. The whole graphics card industry is doing shitty stuff, hell most mega corps are these days.

    • ghariksforge@lemmy.world · 1 year ago

      This is from 10 years ago. Nvidia sucked in those days. The demand from machine learning changed all that and forced Nvidia to go open source.

      • CalcProgrammer1@lemmy.ml · 1 year ago

        NVIDIA never really went open source…they opened up their kernel drivers to a degree (by moving the majority of the interesting bits into the GPU firmware at that) but the userspace portion (Vulkan, OpenGL, OpenCL, CUDA, etc) is still very much closed source.

    • gens@lemmy.fmhy.ml · 1 year ago

      I got myself an expensive-ish gtx 480 because nvidia said they would support vulkan on it. They never implemented vulkan for Fermi. Never again will I buy from greedy liars.

      Linus is talking about a different thing entirely. And while their drivers were always great, there is much more to the story than just how well they render 3D.

    • GreyBeard@lemmy.one · 1 year ago

      I’ve got a 7900xt, and idle power draw and heat generation are off the charts, so I have to aggressively sleep my computer when not in use. I’ve been hoping for an update to fix it, but nothing yet. And this isn’t really AMD’s problem, but a lot of AI stuff just isn’t possible on RDNA 3 because the python libraries don’t support it. Some library updates have started supporting it, but the tools that make the models work often use old library versions.

      • De Lancre@lemmy.world · 1 year ago

        Sorry for the late response, only noticed you just now. For me, idle power draw is about 30W on the card (60W if the memory clock gets stuck at a high clock). It’s worse than it should be (without the memclock bug it’s about ~17W), but doable. If you have higher power draw, something else is probably broken.
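
        If anyone wants to compare numbers, amdgpu reports board power through hwmon. A small read-only sketch (`power1_average` is the sensor amdgpu has historically exposed; some newer kernels report `power1_input` instead, so treat the name as an assumption):

```python
import glob

def gpu_power_watts(card: str = "card0"):
    """Read the GPU's reported board power via hwmon and convert
    from microwatts to watts. Returns None if no sensor is found."""
    pattern = f"/sys/class/drm/{card}/device/hwmon/hwmon*/power1_average"
    for path in glob.glob(pattern):
        try:
            with open(path) as f:
                return int(f.read().strip()) / 1_000_000
        except (OSError, ValueError):
            continue  # sensor unreadable; try the next match
    return None

if __name__ == "__main__":
    watts = gpu_power_watts()
    print(f"{watts:.1f} W" if watts is not None else "no power sensor found")
```

        That reads the same value tools like `sensors` show for the card, without needing anything installed.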

        • GreyBeard@lemmy.one · 1 year ago

          I’d have to pull out my Kill A Watt to get an accurate reading, but my house’s draw increases by about 0.2–0.3kW when my PC is on. That doesn’t count all my monitors and whatnot. It’s a noticeable drain on my house’s grid at idle.
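
          For scale: if roughly 0.25 kW of that is the idle PC, a quick back-of-the-envelope (the $0.15/kWh rate is an assumed example, not a quoted price):

```python
def idle_energy_kwh(draw_kw: float, hours: float) -> float:
    """Energy used by a constant draw over a period, in kWh."""
    return draw_kw * hours

# 0.25 kW around the clock for a 30-day month:
monthly_kwh = idle_energy_kwh(0.25, 24 * 30)
print(monthly_kwh)         # 180.0 kWh
print(monthly_kwh * 0.15)  # 27.0 -> about $27/month at $0.15/kWh
```

          Which is why aggressively sleeping the machine, as above, is worth the hassle.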

    • Vilian@lemmy.ca · 1 year ago

      Aren’t the TDP issue and the two-year-old bug being reported more and more as fixed in the latest kernels?