Posts: 2 · Comments: 134 · Joined: 2 yr. ago

  • I, and a lot of other fedizens, don't want the type of people who use BlueSky and Facebook in my ActivityPub fediverse.

    Fedi is cool because it's small, weird, and highly enriched with people I'm actually interested in communicating with. I don't need or want the pool to be diluted with an ocean of piss.

  • I would only agree with defederating if it's proven that they are blocking anti-genocide content on other servers from being visible to their users.

    In general I believe it's best to maintain lines of communication with people who hold reprehensible ideologies, both to better understand such backwards ideologies and to provide a lifeline back to consensus reality for people who've been swept up by them.

    I also believe that in the majority of cases, users should wield the power to instance-block for themselves. Or at least, servers where this is the case are the ones I prefer to participate on.

    However, if it becomes clear that feddit.org admins are censoring content external to their instance due to their support of Zionism, then my first objection becomes irrelevant, and the second would be questionable, given the admins' efforts to put external fedi content through a pro-genocide filter. At that point, defederation would be both warranted and wise.

  • Currently in a holding pattern: I got RAM & SSD for a new-to-me "1-liter" server before tariffs hit, but I don't have the server itself nor any money to buy one, even though the 9th- or 10th-gen Intel units I'm looking at would only cost $120 to $150 barebones.

    Money to buy one isn't coming in because the place where I live has nonstop noise & activity and I don't have a separate room or any door I can close. That severely limits my ability to work, since I have auditory hypersensitivity and an absolute need for solitude in order to recharge enough to think. 🤷🏻

  • One thing I really like about the author's fedi coverage: he doesn't kowtow to rank stupidity or avoid mentioning alternatives because of hivemind disapproval.

    From a publisher (user) perspective, Nostr's censorship resistance by design blows the doors off of what ActivityPub can claim in that regard. And I say that as someone who's been pretty much all-in on ActivityPub fedi since its inception (almost a decade, wow).
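    The "by design" part is concrete: a Nostr event is a self-signed JSON blob that any willing relay can carry, so no single server can delete your identity or your posts. A rough, stdlib-only sketch of the NIP-01 event shape (placeholder key and relay URLs, signature omitted):

    ```python
    import hashlib
    import json
    import time

    # Illustrative values only; a real client derives the pubkey from a
    # secp256k1 keypair and signs the id with a BIP-340 Schnorr signature.
    pubkey = "a" * 64          # hex-encoded x-only public key (placeholder)
    created_at = int(time.time())
    kind = 1                   # kind 1 = short text note
    tags = []
    content = "posted once, carried by any relay that wants it"

    # Per NIP-01, the event id is the sha256 of this serialization.
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"), ensure_ascii=False,
    )
    event_id = hashlib.sha256(serialized.encode()).hexdigest()

    event = {
        "id": event_id,
        "pubkey": pubkey,
        "created_at": created_at,
        "kind": kind,
        "tags": tags,
        "content": content,
        "sig": "<BIP-340 signature over id, omitted in this sketch>",
    }

    # Censorship resistance comes from fan-out: the same signed event can be
    # sent to any number of relays as ["EVENT", event]; none of them owns it.
    for relay in ["wss://relay.example.one", "wss://relay.example.two"]:
        print(relay, "<-", json.dumps(["EVENT", event])[:60], "...")
    ```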

  • Everybody downvoting your comment has zero experience being the go-to family tech guy for relatives in their 80s and 90s who can't reliably distinguish between windows, dialog boxes, menus, and buttons.

  • Don't view keys in Monero "expose information to the authorities" if used in such a way? Hard to see the need for a system where tokens can "expire" when we already have a perfectly good one where they don't.

  • Ah okay, this is what I was thinking of.

    Sounds like PeerTube admins can also proactively choose to mirror videos from other instances?

    Very nice. Would be cool to get some kind of massive user influx and see how well it does under load.
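    For what it's worth, the admin-side knob for that lives in the server config. A hedged sketch, from memory, of the redundancy section of PeerTube's production.yaml (values illustrative; the bundled production.yaml.example is the authoritative reference):

    ```yaml
    # Follow other instances' videos and re-host copies according to a strategy.
    redundancy:
      videos:
        check_interval: '1 hour'        # how often to look for videos to mirror
        strategies:
          - size: '10GB'                # disk budget for this strategy
            min_lifetime: '48 hours'    # keep mirrored copies at least this long
            strategy: 'most-views'      # also e.g. 'trending' or 'recently-added'
    ```

    As I understand it, mirrored copies then get advertised as extra download/streaming sources, which is exactly the kind of thing that would help absorb a sudden user influx.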

  • The text here is just copypasta about what PeerTube is, not about the so-called v1 app.

    More useful: https://joinpeertube.org/news/app-v1

    Unfortunately, this reveals that one apparently still can't upload or manage a channel from the mobile app. 🤦🏻

  • > But what makes this AI model unique is that it’s lightweight enough to work efficiently on a CPU, with TechCrunch saying an Apple M2 chip can run it.

    An Apple M2 can run bigger, higher-precision models than this, FWIW. The more important question is whether older CPUs can run it with acceptable performance.

    > AI models are often criticized for taking too much energy to train and operate. But lightweight LLMs, such as BitNet b1.58 2B4T, could help us run AI models locally on less powerful hardware. This could reduce our dependence on massive data centers and even give people without access to the latest processors with built-in NPUs and the most powerful GPUs to use artificial intelligence.

    This is definitely relevant to my interests, especially with NPU support for such models coming. Dirt-cheap ARM-based PCs built around e.g. the RK3588 are already shipping with small NPUs.
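    For anyone wanting to poke at it, here's a rough sketch of what running a small model like this locally could look like with the standard Hugging Face transformers API. The model id is my assumption, and the official BitNet release may need bitnet.cpp or a patched transformers build to get the real 1.58-bit CPU kernels rather than a dequantized fallback:

    ```python
    # Hedged sketch: load a small causal LM on CPU and generate a short reply.
    # "microsoft/bitnet-b1.58-2B-4T" is an assumed model id; swap in whatever
    # the actual release is published under.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "microsoft/bitnet-b1.58-2B-4T"

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)  # CPU by default

    prompt = "Why do 1-bit weights reduce memory bandwidth?"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
    ```

    The same handful of lines is how you'd try it on an RK3588 board or an old laptop; the open question is just whether the tokens-per-second are bearable there.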