You always have the option of partitioning some free space on your hard drive and dual-booting Windows alongside Linux Mint until you're 100% confident in erasing Windows from the drive. When I first got started with it years ago I had similar fears that something would go wrong with the process, or there would be driver issues, or I wouldn't be able to start my favorite software. So I dual-booted Windows on my laptop for about a year until I realized I hadn't needed to use Windows at all.
Hell yes! Scream it from the rooftops with your menthol-laced breath. The celestial calendar foretold that 2025 would truly be the Year of the Linux™. The time is now!
It wasn't really my intent to complain, but rather to inform the OP about how I see Lemmy and which kinds of people make up a good portion of the overall community that contributes to conversations, after being here for quite some time. I don't think I whinged or went on an opinionated rant that really qualifies as complaining.
People are more genuinely interested in actually contributing to a conversation here, and more likely to read through your stuff and reply. I feel more seen.
Reddit is generic, corporate, algorithm-flavored slop, with agenda-driven LLMs talking to human morons somehow dumber and less aware than the LLMs. Lemmy is at least mostly human, but has a personality-archetype bias that takes getting used to. Even in niche communities here there's a high likelihood you're talking to someone who's either a left-leaning political activist, really into alternative gender identity politics, knows a lot about information technology/STEM, has some serious kinky fetishes, is neurodivergent, or some mix of the above.
So you have the conversational pitfalls that come from talking to tech nerds, liberal arts students, the loud and proud members of LGBTQ+, tankies, and all the in-between, relatively outcast groups that didn't fit well on Reddit in the first place. Every tenth post on All is going to be about how fucked climate change is, LGBTQ+ rights, femboys, Trump/Elon/conservative Republicans doing something stupid or evil or fascist, a really unfunny 'meme' that's really about spreading some message or showcasing how victimized X minority group is, why Linux is good and Windows/Microsoft is bad, or some half-baked plan by young political activists who think they can overthrow a global corporatocracy with some clever coordinated consumer protesting. At least the content is overall consistent.
As someone who doesn't really identify with most of these, I'm left feeling Lemmy isn't for me sometimes, but it's a decent enough social outlet that I can tune out the stuff I don't care for while being involved with the niche communities I'm actually here to be part of.
Look, being real: they would get away with it no matter which decrepit old man was in office or what their politics are. America is a corporatocracy wearing the skin of democracy. When the IRS audited Microsoft for tax evasion, the IRS got sued and defunded through lobbying to the point of being forced to back off. Fucking Microsoft took down the IRS. The world has changed and our old institutions of power are waning.
Your primary gaming desktop's GPU will be your best bet for running models. First check your card's exact specs; the more VRAM the better. Nvidia is preferred, but AMD cards work too.
First you can play with llamafiles to just get started, no fuss no muss: download one and follow the quickstart to run it like an app.
Once you get it running, learn the ropes a little, and want more, like better performance or the latest models, you can spend some time installing and running kobold.cpp with cuBLAS for Nvidia or Vulkan for AMD to offload layers onto the GPU.
If you're on Linux you can boot into a CLI-only environment to save some VRAM.
Connect to the program from your phone, a Pi, or another PC on your network using the host's local IP and an open port.
In theory you can even pool all your devices together with distributed inference frameworks like exo.
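The "connect from your phone or another PC" step above can be sketched in a few lines of Python. This assumes kobold.cpp's default port (5001) and its `/api/v1/generate` endpoint; the `192.168.1.50` address is a made-up example, so swap in your host machine's actual LAN IP:

```python
import json
from urllib import request

HOST = "192.168.1.50"  # hypothetical LAN IP of the PC running kobold.cpp
PORT = 5001            # kobold.cpp's default port

def build_generate_request(host, port, prompt, max_length=80):
    """Build the URL and JSON payload for kobold.cpp's /api/v1/generate endpoint."""
    url = f"http://{host}:{port}/api/v1/generate"
    payload = {"prompt": prompt, "max_length": max_length}
    return url, payload

def generate(host, port, prompt):
    """POST the prompt to the local server and return the generated text."""
    url, payload = build_generate_request(host, port, prompt)
    req = request.Request(url, data=json.dumps(payload).encode(),
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["results"][0]["text"]

# On your own network, with the server running, you'd call:
# print(generate(HOST, PORT, "Why is the sky blue?"))
```

The same idea works from a phone browser too; you just point it at `http://<host-ip>:<port>` instead of making the request yourself.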
First you need a program that reads and runs the models. If you are an absolute newbie who doesn't understand anything technical, your best bet is llamafiles. They are extremely simple to run: just download one and follow the quickstart guide to start it like an application.
They recommend the LLaVA model, and you can choose from several prepackaged ones. I like the Mistral models.
Then, once you get into it and start wanting to run things more optimized and offloaded onto a GPU, you can spend a day trying to set up kobold.cpp.
They both start a local server; you can point your phone or another computer on your WiFi network at it using the local IP address and port, and set up port forwarding for access over phone data.
If you want a more roleplay-oriented model with more creativity, at the cost of other things, you can try the ArliAI finetune of Nemo.
If you want the model to remember things long-term, you need to bump its context size up. You can trade GPU layers for context size, drop down a quant, or move to a smaller model like Llama 8B.
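To see why context size trades off against GPU layers, here's a back-of-the-envelope sketch of how much VRAM the KV cache eats as context grows. The defaults are ballpark numbers for a Llama-8B-class model with an fp16 cache; they're assumptions, so check your actual model card:

```python
def kv_cache_bytes(context_len, n_layers=32, n_kv_heads=8,
                   head_dim=128, bytes_per_elem=2):
    """Rough KV-cache size: 2 (K and V) x layers x kv heads x head dim
    x tokens x bytes per element. Defaults are assumed Llama-8B-ish
    values with an fp16 (2-byte) cache."""
    return 2 * n_layers * n_kv_heads * head_dim * context_len * bytes_per_elem

gib = 1024 ** 3
for ctx in (4096, 8192, 16384):
    print(f"{ctx:6d} tokens -> {kv_cache_bytes(ctx) / gib:.2f} GiB of KV cache")
# With these assumed dims, cache cost doubles every time context doubles:
# roughly 0.5 GiB at 4k, 1 GiB at 8k, 2 GiB at 16k, on top of the weights.
```

That linear growth is why doubling context can force you to offload fewer layers or drop a quant.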
Local LLMs aren't perfect, but they're getting more usable. There are abliterated models and uncensored finetunes to choose from if you don't like your LLM rejecting your questions.
To see if it can do it, and how accurate its general knowledge is compared to the real data. A locally hosted LLM doesn't leak private data to the internet.
Most webpages and Reddit posts in search results are themselves full of LLM-generated slop now. At this stage of the internet, if you're gonna consume slop one way or the other, it might as well be on your own terms: by self-hosting an open-weights, open-license LLM that can directly retrieve information from fact databases like WolframAlpha, Wikipedia, the World Factbook, etc. through RAG. It's never going to be perfect, but it's getting better every year.
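The RAG idea is simple enough to sketch in a toy form: retrieve the best-matching fact, then prepend it to the prompt the LLM sees. The `FACTS` dict here is made-up stand-in data and the word-overlap matcher is a deliberately naive assumption; a real setup would query sources like Wikipedia or WolframAlpha and rank with embeddings:

```python
# Toy stand-in "fact database"; a real RAG pipeline queries live sources.
FACTS = {
    "paris": "Paris is the capital of France.",
    "mount everest": "Mount Everest is 8,849 m tall.",
}

def retrieve(question: str) -> str:
    """Pick the fact whose key shares the most words with the question.
    (Naive word overlap; real retrievers use embeddings.)"""
    q_words = set(question.lower().replace("?", "").split())
    best = max(FACTS, key=lambda k: len(q_words & set(k.split())))
    return FACTS[best]

def augment(question: str) -> str:
    """Build the grounded prompt the local LLM would actually see."""
    return f"Context: {retrieve(question)}\nQuestion: {question}\nAnswer:"

print(augment("How tall is Mount Everest?"))
```

The point is that the model answers from retrieved text instead of its own fuzzy memory, which is what keeps the slop on your own terms.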
Here are some websites to help you find the Lemmy communities that best fit as replacements for the subreddits you used to frequent:
https://sub.rehab/?visibleServices=lemmy
https://redditmigration.com/