Thanks for the concern, but no worries: I did my fair share of optimization on my config and I believe I've gotten everything out of it... I will 100% switch to AMD, so my question basically boils down to: can I sell my 3070, or do I have to keep it and put it into a "server" to run StableDiffusion and oobabooga on, because AMD is still too wonky for that?
That's all. My decision doesn't depend on whether this AI stuff works on AMD; it would just speed things up, because then I could sell my old card and get the money quicker.
Insane, you say? So it's much more sane to aim your PC optimization at a config that only the top 5% use, so that 80% of potential players can't run it?... Interesting definition of insanity you have there. Forsaking 80% of your possible target group, and therefore missing out on a bunch of money, to put out hot garbage that needs a PC costing as much as a small new car just to be playable, and still looks like absolute shit.
This is correct, yes. But I want a new GPU because I want to get away from NVidia...
I CAN use 13B models and I CAN create 1024x1024 images, but not without issues: I have to make sure nothing else uses VRAM, and I still run out of memory quite often.
I want to make it more stable and open the door to bigger models and bigger images.
That's outside the scope of this post and not the goal of it.
I don't want to start troubleshooting my NVidia Stable Diffusion setup in an LLM post about AMD :D Thanks for trying to help, but this isn't the right place for that.
That's sadly not very descriptive, since it depends on your iterations... Can you tell me how many it/s you get and which sampler you use? That would make it much easier for me to compare.
I've had AMD cards my whole life and only switched to NVidia 3 years ago, when this whole local LLM and image AI thing wasn't even on the table... Now I'm just pissed that NVidia gives us so little VRAM to play with unless you pay the price of a used car -.-
The AMD drivers ship with the Linux kernel, so yeah, I won't be downloading any GPU drivers on Linux^^
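(For anyone landing here with the same question: this is just a sketch of the sanity check I'd run on a fresh AMD box before pointing oobabooga or Automatic1111 at it. It assumes the ROCm build of PyTorch is installed; ROCm PyTorch exposes the AMD card through the regular torch.cuda API, so the usual calls work.)

```python
# Minimal ROCm sanity check (assumes the ROCm build of PyTorch,
# e.g. installed via the pytorch.org "ROCm" selector).
import torch

print("HIP/ROCm version:", torch.version.hip)      # None on a CUDA-only build
print("GPU visible:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
    free, total = torch.cuda.mem_get_info()
    print(f"VRAM free/total: {free / 1e9:.1f} / {total / 1e9:.1f} GB")
```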
Oobabooga and Automatic1111 are my main questions - I could actually live with a downgrade in performance if I can at least run the bigger models thanks to having way more VRAM. I can't even run 17B models on my current 8GB card, and I can't make 1024x1024 images on Auto1111 without issues either. If I can do those things, just a bit slower, that's fine for me^^
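To put rough numbers on that (heavily hedged: exact VRAM use depends on the loader, quantization format, and context length), here's the back-of-envelope math for model weights alone:

```python
# Back-of-envelope VRAM needed for LLM weights alone, ignoring the KV
# cache, activations, and whatever the desktop already occupies.
# 4.5 bits/weight roughly approximates 4-bit quantization plus its
# per-group overhead; treat all numbers as ballpark, not exact.
def weight_vram_gb(params_billion: float, bits_per_weight: float) -> float:
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

for params in (7, 13, 17, 33):
    fp16 = weight_vram_gb(params, 16)
    q4 = weight_vram_gb(params, 4.5)
    print(f"{params:>2}B   fp16: {fp16:5.1f} GB   ~4-bit: {q4:4.1f} GB")
```

On that math, a 13B model at ~4-bit already sits around 7 GB of weights, which is why an 8 GB card is constantly on the edge, while a 17B model simply doesn't fit; a 20-24 GB card leaves real headroom for context and image generation on top.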
What do you mean by "not needing AI"? Oobabooga and Stable Diffusion have AMD installers, and that's exactly what I'm asking about - that's why I posted in this community...