this post was submitted on 09 Oct 2024
Stable Diffusion
Basically, avoid AMD if you're serious about it. DirectML just can't compete with CUDA: Stable Diffusion performance on Nvidia blows AMD away, and on top of the performance gap there are often compatibility issues too.
A 4090 is as fast as it gets for consumer hardware. I've got a 3090, which has the same amount of VRAM as a 4090 (24GB) but is nowhere near as fast. So a 3090/Ti would be a good budget option.
However, if you're willing to wait, they're saying Nvidia will announce the 5000 series in January. I'm not sure when they'll actually release, though, and there's the usual stock shortage problem with a new series launch. But the 5090 is rumored to have 32GB of VRAM.
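To put those VRAM numbers in perspective, here's a back-of-the-envelope sketch (my own, not from the thread): weights take roughly parameter count times bytes per parameter, plus headroom for activations. The 1.2 overhead factor is an assumption, not a measured value.

```python
# Rough VRAM estimate: weights ~= params x bytes/param, with an
# assumed 1.2x headroom factor for activations and buffers.
def fits_in_vram(params_billion: float, bytes_per_param: int,
                 vram_gb: float, overhead: float = 1.2) -> bool:
    """Rough check whether a model's weights (plus headroom) fit in VRAM."""
    needed_gb = params_billion * bytes_per_param * overhead
    return needed_gb <= vram_gb

# An SDXL-class model (~3.5B params) at fp16 (2 bytes/param) fits
# easily in a 24GB 3090/4090:
print(fits_in_vram(3.5, 2, 24))  # True
```

By this estimate, the extra 8GB on a rumored 32GB 5090 buys headroom for larger models, higher resolutions, or bigger batches rather than being strictly required for current SD models.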
Good to know about CUDA/Direct ML.
I found a couple of 2022 posts recommending 3090s, especially since crypto miners were selling lots of them cheap at the time. Thanks for the heads-up about the 5000 release; I suspect it will be above my budget, but it should net me better deals on a 4090 :P
DirectML sucks, but ROCm is great; you just need to check whether the software you want to use actually supports ROCm. Also note that only a handful of cards are officially supported by ROCm.
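If you want to check what a given PyTorch install can actually use, a quick sketch (my own, assuming PyTorch; ROCm builds reuse the "cuda" device name, so `torch.cuda.is_available()` returns True there too, and `torch.version.hip` is what distinguishes them):

```python
# Hedged sketch: report which GPU backend a PyTorch install exposes.
# ROCm builds of PyTorch reuse the "cuda" device name, so
# torch.cuda.is_available() is True on ROCm as well; torch.version.hip
# is set there instead of torch.version.cuda.
def describe_backend() -> str:
    try:
        import torch
    except ImportError:
        return "pytorch not installed"
    if not torch.cuda.is_available():
        return "cpu only"
    if getattr(torch.version, "hip", None):
        return "rocm " + torch.version.hip
    return "cuda " + str(torch.version.cuda)

print(describe_backend())
```

On an officially supported AMD card with a ROCm build of PyTorch this prints a `rocm` version string; on a DirectML-only setup it falls back to "cpu only", which is exactly the compatibility gap being discussed.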
Yeah, I don't think the 4090 is going down in price. As of now they're more expensive than at launch, and it seems production is ramping down.