Cloud AI services will always be more capable. You can do more with near-limitless RAM and processing power than you'll ever be able to do locally on your mobile/desktop system.
But locally-run AI's usefulness was never in question either.
An example is Google, which made the Coral TPU and then the Pixel TPU chips specifically in a bid to bring AI to edge and mobile devices. And now they've been developing a version of their Bard AI built to run on mobile systems.
There's a huge market for it, but there are a lot of challenges to running AI models on lower-power hardware that have to be addressed. In its current state, most AI on these platforms performs only task-specific operations using heavily condensed models.
Tailscale has long supported DNS addresses that point to your tailnet. Typically they only accept connections from addresses allowed within your tailnet, but there isn't anything particularly complex about funnel accepting connections from any incoming address.
Further, like most of Tailscale's operations, funnel doesn't require them to host or even proxy any significant amount of data; it just directs incoming connections on that domain to a device on your tailnet.
The hosting cost to Tailscale is insignificant and really no different from what they already do on a basic tailnet.
I don't think it will become a paid only option and I don't think it's too beta to use for a home server.
Personally I don't bother using it because I'm comfortable exposing my IP address and opening a port to my home server using direct DNS.
But there are some advantages to using Tailscale funnel: your IP will be hidden and the traffic will be routed through WireGuard, so it's potentially more secure.
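For anyone who does want to try it, the CLI side is pretty simple. This is a rough sketch (the port and hostname here are hypothetical, and the exact syntax varies between Tailscale versions, so check `tailscale funnel --help`):

```
# Expose a local web service (here on port 3000) to the public internet
# at this machine's ts.net hostname, via Tailscale's ingress:
tailscale funnel 3000

# Or run it in the background and check what's exposed:
tailscale funnel --bg 3000
tailscale funnel status

# By contrast, `tailscale serve` shares the same service
# with your tailnet only (no public access):
tailscale serve 3000
```

The funnel/serve split mirrors the point above: funnel is just the same proxying with the "tailnet members only" restriction lifted.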
A whitish-gray, cartoon-styled human skull with large, black eye sockets. Commonly expresses figurative death, e.g., dying from extreme laughter, frustration, or affection.
Popular around Halloween. Not to be confused with ☠️ Skull and Crossbones, though their applications may overlap.
Skull was approved as part of Unicode 6.0 in 2010 and added to Emoji 1.0 in 2015.
Squoosh (Image converter, runs fully in browser but I like hosting it anyway)
Paperless-ng (Document management)
CryptPad (Secure E2EE office collaboration)
Immich (Google Photos replacement)
Audiobookplayer (Audiobook player)
Calibre (Ebook management)
NextCloud (Don't honestly use this one much these days)
VaultWarden (Password/2FA/PassKey management)
Memos (Like Google Keep)
typehere (A simple scratchpad that stores in browser memory)
librechat (Kind of like chatgpt except self-hosted and able to use your own models/api keys)
Stable Diffusion (AI image generator)
JellyFin (Video streaming)
Matrix (E2EE Secure Chat provider)
IRC (oldschool chat service)
FireFlyIII (finance management)
ActualBudget (another finance thing)
TimeTagger (Time tracking/invoicing)
Firefox Sync (Use my own server to handle syncing between browsers)
LibreSpeed (A few instances, for speed testing my connection to the servers)
Probably others I can't think of right now
Most of these I use regularly, and quite a few I use constantly.
I can't imagine living without Searxng, VaultWarden, Immich, JellyFin, and CryptPad.
I also wouldn't want to go back to using the free ad-supported services out there for things like memos, kutt, and lenpaste.
Also, I think librechat is underappreciated. Even just using it for GPT with an API key is infinitely better for your privacy than using the free chatgpt service that collects/owns all your data.
But it's also great for using gpt4 to generate an image prompt, sending it through a prompt refiner, and then sending it to Stable Diffusion to generate an image, all via a single self-hosted interface.
I think people would treat a recurring payment they can depend on every week/month differently than a one-time thing that only happened in the middle of a pandemic.
Yeah, here FedEx isn't really a delivery company, at least for residential addresses.
All they do is leave a note, without knocking, telling you to come get your shit. I'm 90% sure their delivery drivers never even had my package in their truck to begin with.
The true magic of QMK is that you can flash the firmware onto the device itself, so all of these features work independently of the system it's connected to.
Keyboards are often advertised as having much of the same customizability, but it depends on companion software running on the host to work.
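To give a sense of what "flashed onto the device" means in practice, here's a minimal QMK keymap sketch. The layout macro and matrix size vary per board; this assumes a hypothetical 2x2 macropad, but the layer/keycode mechanics are standard QMK:

```c
// keymap.c — compiled into the firmware and flashed onto the keyboard,
// so layers and custom keys work on any host with no companion software.
#include QMK_KEYBOARD_H

const uint16_t PROGMEM keymaps[][MATRIX_ROWS][MATRIX_COLS] = {
    // Layer 0: base layer
    [0] = LAYOUT(
        KC_A,    KC_B,
        MO(1),   KC_ENT       // MO(1): momentarily activate layer 1 while held
    ),
    // Layer 1: media/utility layer
    [1] = LAYOUT(
        KC_VOLU, KC_VOLD,
        _______, QK_BOOT      // _______: pass through; QK_BOOT: enter bootloader
    ),
};
```

Because this all lives on the microcontroller, the same layers and macros behave identically whether you plug into Linux, Windows, or a phone.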
You're right, for some reason I thought Firebase was allowed.
Yeah, ntfy is a FOSS notification service.
As for drop-in replacements, I don't think such a thing really exists on the user side; it's entirely up to the app developer how they implement notifications.
To use ntfy instead of FCM, your app would need to be designed to do so or support it as an alternative option.
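For anything you control yourself (scripts, servers), ntfy is just plain HTTP, which is why it's so easy to self-host. A quick sketch of its publish/subscribe API (the hostname and topic here are hypothetical):

```
# Publish a notification by POSTing the message body to a topic:
curl -d "Backup finished" https://ntfy.example.com/homelab

# Subscribe from another machine; messages stream in as JSON lines:
curl -s https://ntfy.example.com/homelab/json
```

The official Android app subscribes to topics the same way, which is what lets it stand in for FCM when an app supports it.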