this post was submitted on 05 Jun 2024

[Dormant] moved to !teslamotors@lemmy.zip

[–] ichbinjasokreativ@lemmy.world 1 points 1 year ago (1 children)

Tesla has a decent relationship with AMD though, right? That means Nvidia is nice-to-have for them, but not necessary.

[–] Endmaker@lemmy.world 4 points 1 year ago (1 children)

How are AMD GPUs useful though? Last I heard, CUDA (and cuDNN) are still Nvidia-only things.

[–] ichbinjasokreativ@lemmy.world 7 points 1 year ago (2 children)

There are compatibility layers that let CUDA code run on AMD, and everything AI can also run natively on ROCm. Using Nvidia is a choice, not mandatory.
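To illustrate the "compatibility layer" idea: AMD's ROCm stack ships hipify tools (hipify-perl, hipify-clang) that port CUDA sources by translating CUDA runtime calls into their HIP equivalents, since HIP deliberately mirrors the CUDA API. The sketch below is a toy, not the real tool — the actual hipify tools also handle kernel launch syntax, headers, and types — but it shows the core renaming step on a few real API pairs (e.g. `cudaMalloc` → `hipMalloc`):

```python
import re

# A handful of real CUDA-runtime -> HIP-runtime name pairs.
# The genuine hipify tools cover the full API surface; this is a toy.
CUDA_TO_HIP = {
    "cudaMalloc": "hipMalloc",
    "cudaMemcpy": "hipMemcpy",
    "cudaMemcpyHostToDevice": "hipMemcpyHostToDevice",
    "cudaFree": "hipFree",
    "cudaDeviceSynchronize": "hipDeviceSynchronize",
}

def hipify(source: str) -> str:
    """Rename whole CUDA identifiers to their HIP equivalents.

    Word boundaries (\\b) ensure only complete identifiers are
    replaced, so e.g. a user-defined myCudaMallocWrapper is untouched.
    """
    pattern = re.compile(r"\b(" + "|".join(CUDA_TO_HIP) + r")\b")
    return pattern.sub(lambda m: CUDA_TO_HIP[m.group(1)], source)

print(hipify("cudaMalloc(&ptr, n); cudaMemcpy(d, s, n, cudaMemcpyHostToDevice); cudaFree(ptr);"))
```

Because HIP keeps the same call signatures, a translation this mechanical already gets most straightforward CUDA code compiling for AMD hardware, which is why porting is often described as a rename-plus-recompile job rather than a rewrite.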

[–] Endmaker@lemmy.world 3 points 1 year ago
[–] notfromhere@lemmy.ml 2 points 1 year ago

What is the best working compatibility layer for running CUDA on AMD? ROCm seems to drop hardware support pretty quickly after release, though, so it's hard for it to get a foothold. As Karpathy has shown, doing low-level C++ can have some amazing results…