thickertoofan
Posts: 9 · Comments: 48 · Joined: 4 mo. ago

  • Ah, well, the good thing is that someone reached out to me from PieFed and I'm transferring my community there.

  • I know, right? lemmy.ml has the same UI, and I'm considering it as my next go-to.

  • How'd the migration work?

  • NOO! I loved this place.

  • Good, amazing even, but I'm not a Linux fanboy who'll feel giddy over this. My friends would definitely press me about it. But yeah, I'm happy.

  • As I've read somewhere, finite state machines cannot be sentient, or "intelligent" in the way we expect them to be. An LLM cannot learn new things once trained. I'm waiting for a new breakthrough in this field before I'm fully convinced we're getting replaced.

  • LocalLLaMA @sh.itjust.works

    LLMs and their efficiency, can they really replace humans?

    LocalLLaMA @sh.itjust.works

    8B Byte Latent Transformer model released by Meta

    LocalLLaMA @sh.itjust.works

    Continuous Thought Machines

  • I'm not the smartest person out there to explain it, but it's like... instead of floating-point numbers as the weights, it's just -1, 0, 1.
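    A rough sketch of the idea (my own toy NumPy illustration of the absmean-style ternary rounding described for BitNet b1.58, not Microsoft's actual code):

    ```python
    import numpy as np

    def ternary_quantize(w):
        """Snap a float weight matrix to {-1, 0, +1} plus one scale factor,
        roughly in the spirit of BitNet b1.58's absmean quantization scheme."""
        scale = np.mean(np.abs(w)) + 1e-8           # average weight magnitude
        w_q = np.clip(np.round(w / scale), -1, 1)   # each weight becomes -1, 0 or +1
        return w_q.astype(np.int8), scale

    # toy example: quantize a small random weight matrix and check the error
    w = np.random.randn(4, 8).astype(np.float32)
    w_q, scale = ternary_quantize(w)
    print(np.unique(w_q))                  # only values from {-1, 0, 1} appear
    print(np.abs(w - w_q * scale).mean())  # mean reconstruction error
    ```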

  • LocalLLaMA @sh.itjust.works

    Microsoft just released BitNet!

    Permanently Deleted

  • Nice to know. Thanks.

  • Same, I have an HDD from 2012 that holds my childhood memories. The first thing I'm going to do when I start earning is get it repaired by a reputable service.

  • Ooof. 700 MB discs.

  • Permanently Deleted

  • Whyyy???

  • Permanently Deleted

  • Welcome here!

  • LocalLLaMA @sh.itjust.works

    Soon you will be able to run LLMs natively in Docker

  • I think the bigger bottleneck is SLAM; running that is intensive, and it won't run directly on video. SLAM is tough, I guess, and reading the repo doesn't give any clues about whether it can run on CPU inference.
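    To make the dependency concrete, here is a hypothetical sketch of the pipeline I mean; `run_slam` and `load_spatiallm` are placeholder names I made up, not the repo's actual API:

    ```python
    # Hypothetical pipeline sketch: the model doesn't consume video directly;
    # a SLAM stage has to reconstruct a 3D point cloud first, and that stage
    # is the compute-heavy part. Both functions below are placeholders.

    def run_slam(video_path: str) -> list[tuple[float, float, float]]:
        """Placeholder for a SLAM system turning a video into a 3D point cloud."""
        raise NotImplementedError("plug in a real SLAM system here")

    def load_spatiallm():
        """Placeholder for loading the 1B spatial model from its repo."""
        raise NotImplementedError("plug in the actual model loader here")

    def detect_layout(video_path: str):
        points = run_slam(video_path)   # video -> point cloud (the bottleneck)
        model = load_spatiallm()        # then the 1B model reasons over the points
        return model(points)
    ```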

  • LocalLLaMA @sh.itjust.works

    SpatialLM, a 1B model capable of spatial identification using 3D point cloud data. The video demo is amazing.

  • There is a repo they released.

  • It will; they have released a repo with the code.

  • LocalLLaMA @sh.itjust.works

    Microsoft KBLaM

  • I mean, I didn't see any pressing need for a Google Docs alternative, so I might actually be living under a rock.

  • I am not a bot, trust me.

  • LocalLLaMA @sh.itjust.works

    Loaded benchmark for 1B-3B-4B-7B models?

    LocalLLaMA @sh.itjust.works

    Gemma 3 1B and 3B results on a "needle in a haystack"-style test run locally