
RandomLegend [He/Him] @RandomLegend@lemmy.dbzer0.com
Posts: 33 · Comments: 495 · Joined: 2 yr. ago

  • Thanks for the concern, but no worries; I did my fair share of optimization on my config and I believe I got everything out of it. I will 100% switch to AMD, so my question basically just aims at: can I sell my 3070, or do I have to keep it and put it into a "server" to run Stable Diffusion and Oobabooga because AMD is still too wonky for that?

    That's all. My decision doesn't depend on whether this AI stuff works; it just accelerates things if AMD can run it, because then I can sell my old card and get the money quicker.

  • Insane, you say? So it's much more sane to aim your PC optimization at a config that only the top 5% use, so that 80% of potential users cannot run it? Interesting definition of insanity you have there. You forsake 80% of your possible target group, missing out on a bunch of money, and instead put out some hot garbage that needs a PC costing as much as a small new car to play, and it still looks like absolute shit.

  • The good ol' Anything v3 and DPM++ 2M Karras

    That would give me a good baseline. Thanks! :)

  • Check out the Steam hardware survey and target the most used config. That way you can make sure your game is enjoyable for the most players.

  • That'd be awesome. No hurry though

  • This is correct, yes. But I want a new GPU because I want to get away from NVidia...

    I CAN use 13B models and I can create 1024x1024 images, but not without issues: I have to make sure nothing else uses VRAM, and I still run out of memory quite often.

    I want to make it more stable, and open the door to bigger models and bigger images.

  • That's outside the scope of this post and not the goal of it.

    I don't want to start troubleshooting my NVidia Stable Diffusion setup in an LLM post about AMD :D Thanks for trying to help, but this isn't the right place for that.

  • Thanks

    I was hoping to see a console output showing the iterations per second for each specific sampler, but I guess that suffices.

    Thanks again!

  • It's not about the size... it's about how you use it... or something like that

  • That's sadly not very descriptive. It depends on your iterations... Can you tell me how many it/s you get with each sampler you use? That would make it much easier for me to compare.
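    For reference, it/s is just sampling steps divided by wall-clock time, so even a seconds-per-image figure can be converted. A quick sketch (the step counts and timings below are made-up examples, not real benchmark results):

    ```python
    # Convert a measured generation time into it/s so runs with
    # different samplers and step counts can be compared directly.
    # All numbers below are hypothetical, not real benchmarks.

    def iterations_per_second(steps: int, seconds: float) -> float:
        """it/s = sampling steps divided by wall-clock time."""
        return steps / seconds

    # e.g. 20 steps finishing in 4.0 s -> 5.0 it/s
    print(iterations_per_second(20, 4.0))   # 5.0
    # e.g. 30 steps finishing in 10.0 s -> 3.0 it/s
    print(iterations_per_second(30, 10.0))  # 3.0
    ```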

  • Interesting

    Do you only use LLMs, or also Stable Diffusion?

  • I'm not asking about drivers or such. I'm asking about performance, specifically in Oobabooga and/or Stable Diffusion.

    Have you done anything with those?

  • Yeah, that was what I was worried about after reading the article; I've heard about the different backends...

    Do you run AMD + Linux + Automatic1111 / Oobabooga? Can you give me some real-life feedback? :D

  • I've had AMD cards my whole life and only switched to NVidia 3 years ago, when this whole local LLM and image-AI thing wasn't even on the table... now I'm just pissed that NVidia gives us so little VRAM to play with unless you pay the price of a used car -.-

    AMD drivers ship within the Linux kernel, so yeah, I won't be downloading any AMD drivers on Linux^^

    Oobabooga and Automatic1111 are my main questions. I could actually live with a performance downgrade if I can at least run the bigger models thanks to having way more VRAM. I can't even run 17B models on my current 8GB VRAM card, and I can't make 1024x1024 images in Automatic1111 without issues either. If I can do those things, just a bit slower, that's fine for me^^

  • What models are you using, and how many iterations/s do you get on average with them?

    Do you also use Stable Diffusion (Automatic1111)? If yes, same question as above for that^^

  • No worries

    Interesting article. I'd never heard of SHARK; seems promising then.

  • I'm on Linux, but I can live with a painful install. I wanted to hear whether it performs on par with NVidia.

  • Well, that's the question...

    What do you mean by "not needing AI"? Oobabooga and Stable Diffusion have AMD installers, and that's exactly what I'm asking about. That's why I posted in this community...

    To find out how well those AIs run on AMD