Tesla has a decent relationship with AMD though, right? That means Nvidia is nice-to-have for them, but not necessary.
How are AMD GPUs useful though? Last I heard, CUDA (and cuDNN) is still an Nvidia-only thing.
There are compatibility layers that let CUDA code run on AMD, and most AI frameworks also run natively on ROCm. Using Nvidia is a choice, not mandatory.
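For what it's worth, here's a minimal sketch of what "runs natively on ROCm" looks like in practice, assuming you've installed the ROCm build of PyTorch: the same code path works on Nvidia and AMD, because the ROCm wheels expose HIP devices through the regular `torch.cuda` API.

```python
# Minimal sketch: identical PyTorch code on Nvidia (CUDA) and AMD (ROCm).
# On ROCm builds, HIP devices are surfaced through the torch.cuda API,
# so no AMD-specific changes are needed in user code.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
backend = "ROCm/HIP" if torch.version.hip else "CUDA"
print(f"backend: {backend} | device: {device}")

# A tiny matmul to confirm the GPU is actually doing the work.
a = torch.randn(1024, 1024, device=device)
b = torch.randn(1024, 1024, device=device)
print((a @ b).sum().item())
```

So framework-level code is largely portable; the friction is more in lower-level CUDA kernels and in which GPUs ROCm officially supports.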
What is the best working compatibility layer to run CUDA on AMD? ROCm seems to drop support for hardware pretty quickly after release, so it's hard for it to get a foothold. As Karpathy has shown, doing low-level C++ has some amazing results…