Smaug-72B-v0.1: The New Open-Source LLM Roaring to the Top of the Leaderboard
ArchAengelus @lemmy.dbzer0.com
Unless you’re getting used datacenter-grade hardware for next to free, I doubt this. You need around 130 GB of VRAM across your GPUs.
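A rough back-of-envelope sketch of where a figure like that comes from, assuming weights-only memory for a 72B-parameter model (real usage also includes KV cache, activations, and framework overhead, and drops with quantization; the byte-per-parameter values below are illustrative assumptions, not benchmarks):

```python
# Rough weights-only VRAM estimate for a 72B-parameter model at different precisions.
# Figures are illustrative; actual requirements vary with quantization, context
# length (KV cache), and runtime overhead.

PARAMS = 72e9  # Smaug-72B parameter count

BYTES_PER_PARAM = {
    "fp16/bf16": 2.0,
    "int8": 1.0,
    "4-bit quantized": 0.5,
}

for precision, bytes_per_param in BYTES_PER_PARAM.items():
    gib = PARAMS * bytes_per_param / 2**30
    print(f"{precision:>16}: ~{gib:,.0f} GiB just for the weights")
```

At fp16 that works out to roughly 134 GiB for the weights alone, which is consistent with the ~130 GB figure above; 4-bit quantization brings the weights down to the mid-30s of GiB at the cost of some quality.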