• ihatetheworld@lemmy.ml
    2 months ago

    I am perfectly happy with my 6900 XT. I'm running it at a 220 W limit, which still manages a ~20k graphics score in Time Spy. The trade-off was about 10% performance lost vs. 350 W for a 37% power reduction, and roughly 5% performance lost vs. 300 W for a 27% power reduction.

    I was previously running it tuned for 300 W but have since switched to 220 W for good. The performance lost in games isn't noticeable, but the drop from a 220-250 W max power draw to 150-170 W is huge. The PC is completely silent and I can game a lot longer without having to turn on the AC.
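
    For anyone doing this on Linux, the amdgpu driver exposes the power cap through sysfs (hwmon), so a limit like this can be set without extra tools. This is only a rough sketch; the exact card/hwmon path varies per system, and the value is in microwatts:

      # check the maximum allowed cap (path varies per system)
      cat /sys/class/drm/card0/device/hwmon/hwmon*/power1_cap_max
      # set a 220 W cap (value is in microwatts)
      echo 220000000 | sudo tee /sys/class/drm/card0/device/hwmon/hwmon*/power1_cap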

    The 7800 XT is a good purchase since it's pretty much a better version of the 6900 XT out of the box. (Not comparing against those XTXH 6900 XTs that throw away power efficiency for an additional ~10% performance by running at a 350 W limit.)

  • jaxiiruff@lemmy.zip
    2 months ago

    Already bought mine months ago and have been enjoying it. Not so much the ROCm side of things, but eh.

    • billhead@sh.itjust.works
      2 months ago

      Are there incompatibilities or performance issues?

      I’m planning on upgrading in the near future and starting to mess with Stable Diffusion and other AI projects, and I’m running Linux (I use Arch, btw), so I was leaning towards AMD instead of Nvidia.

      • jaxiiruff@lemmy.zip
        2 months ago

        Decent for now, until ROCm and ZLUDA improve. I use NixOS and run my AI stuff in Docker containers, as that's the easiest way imo because of how fucked up the dependencies are, especially for ROCm.
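
        If it helps, the usual pattern for ROCm containers is just to pass the kernel devices through and start from one of AMD's prebuilt images. The image/tag below is only an example; pick whatever matches your ROCm version:

          # hand the GPU to the container (standard ROCm Docker flags); image tag is illustrative
          docker run -it --device=/dev/kfd --device=/dev/dri \
            --group-add video --security-opt seccomp=unconfined \
            rocm/pytorch:latest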

        Basically, getting AMD working for this stuff right now means making sure certain versions of ROCm, certain versions of the projects, and certain versions of PyTorch all like each other. The most dependency hell of all dependency hells.
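
        As a concrete example of what that version-matching looks like: PyTorch ships separate wheels per ROCm release, so you end up pinning an index URL like the one below. The rocm6.0 tag is just an example, match it to whatever ROCm you actually have installed:

          # install the PyTorch build that matches your ROCm version (example: ROCm 6.0)
          pip install torch torchvision --index-url https://download.pytorch.org/whl/rocm6.0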

        So most projects have a hell of a time supporting ROCm, which means you mostly have to use alternative forks, and even when a ROCm version exists it's used so rarely that half the time nobody knows whether it works or not. I will say you will have the EASIEST time by far if you use a 7900 XT, because most things are built to support that card. Otherwise, good luck. Get used to using environment variables such as:

        HSA_OVERRIDE_GFX_VERSION=11.0.0 (or 10.3.0 if that one doesn't work; I use 11.0.1. These are the GFX target versions for GPUs supported by ROCm, in case yours isn't officially supported.)
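
        In practice you just set it in the environment of whatever you're launching, roughly like this (launch.py is only a stand-in for your frontend's launcher script):

          # set it for a single run...
          HSA_OVERRIDE_GFX_VERSION=11.0.0 python launch.py
          # ...or pass it into a container alongside the device flags
          docker run -e HSA_OVERRIDE_GFX_VERSION=11.0.0 --device=/dev/kfd --device=/dev/dri rocm/pytorch:latest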

        TL;DR: it's all a big mess right now, but it does work if you fuck with it a bunch. I got my 7800 XT to work nicely with Ollama + Open WebUI for text generation. For stable diffusion it's definitely a shit show, at least for my preferred UI, InvokeAI: it doesn't work at all and only uses my CPU (my CPU is also AMD, so maybe some fuckery there). However, I don't regret it, as AMD is truly the best, especially on Linux, but definitely not for AI as it currently stands.
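
        If you hit the same "it only uses my CPU" problem, a quick sanity check is to ask PyTorch whether it can see the GPU at all (on ROCm builds this still goes through the torch.cuda API):

          # prints True if the ROCm PyTorch build sees the GPU, False if it will fall back to CPU
          python -c "import torch; print(torch.cuda.is_available())"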

    • Lucy :3@feddit.org
      2 months ago

      If the top 8000 card is at least as good as the 7900 XTX, yes. But since they said they won't have any top-level GPUs, I'm not confident it will be. I would've waited for more information before building a new PC, just as I waited until Ryzen 9000 and X870(E) were released. But honestly, I'm just going to gamble and get the components ASAP, because there is no real information about RX 8000 or 79950 yet anyway.