Posts 0 · Comments 94 · Joined 2 yr. ago

  • I think at this point we are arguing belief.

    I actually work with this stuff daily, and there are a number of 30B models that exceed ChatGPT at specific tasks such as coding or content generation, especially when enhanced with a LoRA.

    airoboros-33b1gpt4-1.4.SuperHOT-8k, for example, comfortably outputs > 10 tokens/s on a 3090 and beats GPT-3.5 at writing stories, probably because it’s uncensored. It also has an 8k context instead of 4k.

    Several recent LLaMA 2-based models exceed ChatGPT on coding and classification tasks and are approaching GPT-4 territory. Google Bard has already been clobbered into a pulp.

    The speed of advances is stunning.

    M-series Macs can run large LLMs via llama.cpp because of their unified memory architecture - in fact, a recent MacBook with 64GB of unified memory can comfortably run most models just fine. Even notebook AMD GPUs with shared memory have started running generative AI in the last week. A minimal sketch of running one of these models locally is at the end of this comment.

    You can follow along at chat.lmsys.org. Open-source LLMs are only a few months old but have already started encroaching on the proprietary leaders, who have a years-long head start.
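
    For anyone who wants to try this locally, here is a minimal sketch using the llama-cpp-python bindings. The model file name, quantization, and settings are illustrative placeholders, not a specific recommendation.

    ```python
    # Minimal local-inference sketch with llama-cpp-python (pip install llama-cpp-python).
    # The model path and parameters below are placeholders; use any quantized GGUF/GGML file you have.
    from llama_cpp import Llama

    llm = Llama(
        model_path="./models/airoboros-33b-superhot.q4_K_M.gguf",
        n_ctx=8192,       # extended 8k context, as with the SuperHOT variants
        n_gpu_layers=-1,  # offload all layers: CUDA on a 3090, Metal on Apple Silicon
    )

    out = llm("Write the opening paragraph of a short story.", max_tokens=256)
    print(out["choices"][0]["text"])
    ```

    The rough arithmetic behind the hardware claims: at 4-bit quantization a 33B-parameter model needs around 17–20GB for weights, which is why it fits on a single 24GB 3090 or well within a 64GB unified-memory Mac.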

  • You are answering a question with a different question. LLMs don’t make pictures of your mom. And this particular question? One that has existed roughly since Photoshop has.

    It just gets easier every year. It was already easy. You could already pay someone 15 bucks on Fiverr to do all of that, and have been able to for years.

    Nothing really new here.

    The technology is also easy. Matrix math. About as easy to ban as mp3 downloads. Never stopped anyone. It’s progress. You are a medieval knight asking to put gunpowder back into the box, but it’s clear it cannot be put back - it is already illegal to make non-consensual imagery, just as it is illegal to copy books. And yet printers and photocopiers exist.

    Let me be very clear - accepting the reality that the technology is out there, that it’s basic, easy to replicate, and on a million computers now, is not disrespectful to victims of non-consensual imagery.

    You may not want to hear it, but just like with encryption, the only other choice society has is full surveillance of every computer to prevent people from doing “bad things”. Everything you complain about is already illegal and has already been possible - it just gets cheaper every year. What you want protection from is technological progress, because society sucks at dealing with its consequences.

    To be perfectly blunt, you don’t need to train any generative AI model for powerful deepfakes. You can use technology like Roop and ControlNet to synthesize any face onto any image from a single photograph. Training is not necessary.

    When you look at it that way, what point is there in trying to legislate training with these arguments? None.

  • The potential for cost reduction in this field is orders of magnitude. Look at LLaMA running on everything down to a Raspberry Pi after two months.

    There are massive gains to be made - once we have dedicated hardware for transformers, that’s orders of magnitude more.

    See how your phone can play back 24h of video but dies after 3h of browsing? Dedicated hardware codec support.