• 2 Posts
  • 2.02K Comments
Joined 1 year ago
Cake day: June 15th, 2023

  • Basically, avoid AMD if you’re serious about it. DirectML just can’t compete with CUDA. Stable Diffusion performance on Nvidia blows away AMD, and on top of the performance gap there are often compatibility issues too (rough sketch of what the CUDA path looks like at the end of this comment).

    A 4090 is as fast as it gets for consumer hardware. I’ve got a 3090, which has the same amount of VRAM as a 4090 (24GB) but is nowhere near as fast. So a 3090/Ti would be a good budget option.

    However, if you’re willing to wait, word is Nvidia will be announcing the 5000 series in January. I’m not sure when they’ll actually release, though, and there are the usual stock problems with a new series launch. But the 5090 is rumored to have 32GB of VRAM.
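
    For context on the DirectML vs CUDA point above, here’s a minimal sketch of how you’d typically check what your GPU can do and run Stable Diffusion with the `diffusers` library on a CUDA card. The model ID is just an example checkpoint, and the fallback comment is an assumption about a typical AMD-on-Windows setup, not a recommendation:

    ```python
    # Minimal sketch: check the backend and VRAM before loading Stable Diffusion.
    # Assumes `torch` and `diffusers` are installed; the model ID is an example.
    import torch
    from diffusers import StableDiffusionPipeline

    if torch.cuda.is_available():
        device = "cuda"
        props = torch.cuda.get_device_properties(0)
        print(f"{props.name}: {props.total_memory / 1024**3:.1f} GB VRAM")
    else:
        # On AMD under Windows you'd typically fall back to torch-directml here,
        # which is where the performance and compatibility gap shows up.
        device = "cpu"
        print("No CUDA device found; expect much slower generation.")

    # fp16 roughly halves VRAM use versus fp32 on cards that support it.
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",
        torch_dtype=torch.float16 if device == "cuda" else torch.float32,
    ).to(device)

    image = pipe("a photo of an astronaut riding a horse").images[0]
    image.save("out.png")
    ```

    The VRAM check matters because it’s usually memory, not raw speed, that decides which models and resolutions you can run at all; speed just decides how long you wait.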