Posts: 1 · Comments: 162 · Joined: 2 yr. ago

  • Oh man, I almost shot myself in the foot here :) I saw an announcement about it a year ago and wanted to throw you a link. However, I simply could not recall the name, the link, or where I saw it. The official site didn't mention it, so it took me 30 minutes to find this: https://github.com/SolidOS/solidos (Yeah!! I feel like a Lemmy hero right now :-D )

    I wanted to try it out and integrate the SOLID login system in Guix, but unfortunately got caught up with something else. If you decide to play around with it, I would be very interested in hearing about your experiences. Cheers, bro..

  • I absolutely love it, and I'm never going back to an ordinary distribution again. I do fine regarding software: I use the standard channels, the non-free channel, Flatpaks, and a few AppImages. I can't think of anything I'm missing at the moment..

  • Just a thought, but in a few years all the old, ugly photos could be refined, upscaled, content-edited, rotated and animated - in 32K ultra. Software could even recognize the exact phone model a random photo was taken with and pre-set the best filters.

    It probably won't matter much whether the photo was taken with a hundred-year-old handheld plate camera or a brand-new digital one on a mount - it will look great regardless.

    Are you sure photo hardware is the way to go? I think I would just use whatever you already have and upgrade the pictures later, when the software allows it - rough sketch below.
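
    In case it's useful, here's roughly what that 'upgrade later' workflow could look like. This is just a sketch I haven't run; Pillow's Lanczos resize is only a stand-in for a real super-resolution model (Real-ESRGAN or similar), and the folder names are made up.

    ```python
    # Sketch: keep the originals untouched and write "upgraded" copies elsewhere.
    # The Lanczos resize is a placeholder for a learned super-resolution model,
    # which is where the actual quality gain would come from.
    from pathlib import Path
    from PIL import Image

    def upscale_copy(src: Path, dst_dir: Path, factor: int = 4) -> Path:
        """Write an upscaled copy of `src` into `dst_dir`, never touching the original."""
        dst_dir.mkdir(parents=True, exist_ok=True)
        with Image.open(src) as im:
            bigger = im.resize((im.width * factor, im.height * factor), Image.LANCZOS)
            out = dst_dir / src.name
            bigger.save(out)
        return out

    if __name__ == "__main__":
        for photo in Path("old_photos").glob("*.jpg"):  # made-up folder name
            print("upscaled:", upscale_copy(photo, Path("upscaled_photos")))
    ```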

  • If you're low on hardware, then look into the Petals or KoboldAI Horde frameworks. Both share models in a p2p fashion, afaik.

    Petals, at least, lets you create private swarms, so you could host part of a model on your 24/7 server, part on your laptop CPU, and the rest on your laptop GPU - as an example.

    Haven't tried it though, so good luck ;) (rough client-side sketch below)
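
    In case it helps, this is roughly what the Petals client side looks like in Python, as far as I understand the docs - untested by me. The model id and the bootstrap peer address are placeholders; for a private swarm you'd point initial_peers at your own bootstrap peer instead of joining the public one.

    ```python
    # Sketch of the Petals client API (untested). Model id and the initial_peers
    # multiaddr are placeholders; drop initial_peers to join the public swarm.
    import torch
    from transformers import AutoTokenizer
    from petals import AutoDistributedModelForCausalLM

    MODEL = "petals-team/StableBeluga2"                   # placeholder model id
    MY_PEERS = ["/ip4/192.0.2.10/tcp/31337/p2p/Qm..."]    # placeholder bootstrap peer

    tokenizer = AutoTokenizer.from_pretrained(MODEL)
    model = AutoDistributedModelForCausalLM.from_pretrained(
        MODEL,
        initial_peers=MY_PEERS,      # private swarm; omit to join the public one
        torch_dtype=torch.float32,   # CPU-friendly; float16/bfloat16 on GPU
    )

    inputs = tokenizer("A private swarm shares one model across my machines, so",
                       return_tensors="pt")["input_ids"]
    outputs = model.generate(inputs, max_new_tokens=30)
    print(tokenizer.decode(outputs[0]))
    ```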

  • I don't think you have any clue what the military power of the Russians is. I bet you've watched your telly and just trust whatever you are told by western 'analysts'. Currently, the Russian military is the strongest in the world, and it is winning against the third NATO-backed army. Deal with it. Watch 'The Duran' on YT, or follow some real news channels on Telegram..

  • I was unaware that the smaller-context models exhibited the same effect. It does seem logical that we naturally put broad, important information and conclusions at the ends of a sentence. I haven't read the paper yet, but I wonder if the training set - our communication - also contains more information at the ends, so the effect isn't caused by the algorithm but by the data (crude check sketched below). I'll give the paper a read, thx..
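
    Just to make the hunch a bit more concrete, something like this could be run over any text dump. The 'informativeness' proxy (word surprisal from corpus frequency) and the toy corpus are my own placeholders - nothing from the paper.

    ```python
    # Crude check of the hunch: do sentences carry more "information" near the
    # ends? Proxy for information: word surprisal (-log frequency) in the corpus.
    # The toy corpus below is a placeholder; swap in a real text dump.
    import math
    import re
    from collections import Counter

    corpus = (
        "The experiment failed because the sensor overheated after ten minutes. "
        "We repeated the run twice and both times the board shut down early. "
        "In the end the only reliable fix was replacing the power supply entirely."
    ).lower()

    sentences = [s.split() for s in re.split(r"[.!?]+", corpus) if len(s.split()) >= 5]
    freq = Counter(w for s in sentences for w in s)
    total = sum(freq.values())

    buckets, counts = [0.0] * 5, [0] * 5          # fifths of a sentence, start -> end
    for words in sentences:
        for i, w in enumerate(words):
            b = min(4, i * 5 // len(words))       # which fifth this word falls into
            buckets[b] += -math.log(freq[w] / total)
            counts[b] += 1

    for b in range(5):
        print(f"fifth {b + 1}: avg surprisal {buckets[b] / counts[b]:.2f}")
    ```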

  • Haven't tried it (and don't use Docker), so a wild shot: https://github.com/jupyterhub/repo2docker

    'repo2docker fetches a repository (from GitHub, GitLab, Zenodo, Figshare, Dataverse installations, a Git repository or a local directory) and builds a container image in which the code can be executed. The image build process is based on the configuration files found in the repository.'

    That way you can perhaps just delete the Docker image and everything is gone. It doesn't seem to depend on Jupyter (rough build-and-throw-away sketch below)..
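
    If it works the way I think (haven't verified), the whole build-use-delete cycle could be scripted. The repo URL and image name below are just examples; --no-run and --image-name are flags from the repo2docker docs.

    ```python
    # Sketch: build a disposable image from a repo with repo2docker, use it,
    # then remove it so nothing lingers. Repo URL and image name are examples.
    import subprocess

    repo = "https://github.com/binder-examples/requirements"  # example repo
    image = "throwaway-env:latest"

    # Build the image from the repository's config files, but don't start it.
    subprocess.run(["jupyter-repo2docker", "--no-run", "--image-name", image, repo],
                   check=True)

    # Run whatever you need inside the disposable environment.
    subprocess.run(["docker", "run", "--rm", image, "python", "-c", "print('hi')"],
                   check=True)

    # Once the image is removed, everything is gone.
    subprocess.run(["docker", "rmi", image], check=True)
    ```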

  • No experience, but just adding that long-context models have a tendency to 'forget' what's in the middle of the text. Worth noting if you work on long texts, I assume. I can't remember the paper though - there are so many.. (quick way to probe it sketched below)
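
    For what it's worth, the effect is easy to probe yourself: hide one fact at different depths in a long filler prompt and see where recall drops off. query_model below is hypothetical - swap in whatever completion call you actually use; the filler text and the 'needle' are arbitrary.

    ```python
    # Minimal "lost in the middle" probe (sketch). query_model is a hypothetical
    # stand-in for your own model call; filler and needle are arbitrary.
    def query_model(prompt: str) -> str:
        raise NotImplementedError("plug in your own completion API here")

    NEEDLE = "The secret code is 7481."
    FILLER = "The sky was grey and nothing much happened that day. " * 400

    def recalled_at(depth: float) -> bool:
        """Insert the needle at `depth` (0.0 = start, 1.0 = end) and check recall."""
        cut = int(len(FILLER) * depth)
        prompt = (FILLER[:cut] + NEEDLE + " " + FILLER[cut:]
                  + "\n\nWhat is the secret code?")
        return "7481" in query_model(prompt)

    if __name__ == "__main__":
        for depth in (0.0, 0.25, 0.5, 0.75, 1.0):
            print(f"needle at {depth:.0%}:", "recalled" if recalled_at(depth) else "missed")
    ```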