I'm a big fan of Jellyfin. I run it at home with a dedicated Nvidia A2000 for hardware transcoding. It can transcode multiple 4K streams with tonemapping faster than real time.
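For the curious, the GPU pipeline Jellyfin drives looks roughly like this if you run it by hand. A minimal sketch wrapping ffmpeg from Python; the file names, bitrate, and resolution are made up, and the tonemap_cuda filter only exists if your ffmpeg build was compiled with CUDA filter support:

```python
import subprocess

# Rough sketch of a GPU-only transcode like the ones Jellyfin performs.
# File names, bitrate, and resolution are illustrative; tonemap_cuda is
# only available in ffmpeg builds with CUDA filter support.
cmd = [
    "ffmpeg",
    "-hwaccel", "cuda",                # decode on the GPU (NVDEC)
    "-hwaccel_output_format", "cuda",  # keep decoded frames in GPU memory
    "-i", "movie-4k-hdr.mkv",
    # Downscale on the GPU; a build with CUDA filters could chain
    # tonemap_cuda here to map HDR down to SDR as well.
    "-vf", "scale_cuda=1920:1080",
    "-c:v", "hevc_nvenc",              # encode on the GPU (NVENC)
    "-preset", "p5",                   # NVENC quality/speed preset (p1-p7)
    "-b:v", "8M",
    "-c:a", "copy",                    # audio passes through untouched
    "movie-1080p.mkv",
]
subprocess.run(cmd, check=True)
```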
As much as I'd love to use Jellyfin, there are two major issues. First, my internet connection is so slow that I'd be lucky to stream 720p at a low bitrate. I'd spend the money on a faster connection, but I live in an area that doesn't even get cell phone service. My options are DSL and Starlink, and I have both; the DSL is just slow, and Starlink's uplink speed isn't much better, plus I have enough obstructions to make it somewhat unreliable. The second problem is that Jellyfin has too steep a learning curve. Telling my relatives "oh, if it starts buffering, just lower the bitrate" isn't an option. Not to mention, to get around my slow uplink I'd have to run it on a VPS, and a VPS with the resources this requires is way too expensive for me.
If any appliance manufacturer says that accessing your own appliance (one that you own) outside their software ecosystem is financially "damaging" to them, they might as well be saying "Hey, just so you know, we're collecting and selling your data." If you have already purchased the appliance and their software is free, there is absolutely no other way that using a third-party application could hurt their bottom line.
Thanks, Haier, for letting me know never to purchase your products.
It really irritates me when IoT devices force you to use "the cloud" for access. My home automation consists of roughly 100 devices. The vast majority are Zigbee, but a few use wifi. With the exception of my irrigation controller, all the wifi devices are blocked at the firewall from accessing the internet. The fact that I have to send a command halfway across the country to a remote server, only for it to be sent right back to my home network, just to change the watering schedule for my plants is ridiculous. Sure, I could buy a different controller, but I already spent $300 once. I'm not doing it again.
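In case anyone wants to do the same, the blocking itself is a single rule. A rough sketch for an nftables-based Linux router; the device IP and interface name are placeholders for whatever your network uses:

```python
import subprocess

# Sketch: drop WAN-bound traffic from one IoT device while leaving its LAN
# access (and thus local control) intact. Assumes an nftables-based Linux
# router; the IP and interface name below are placeholders.
IOT_DEVICE = "192.168.1.50"   # the wifi device to cut off from the internet
WAN_IF = "eth0"               # the router's upstream interface

commands = [
    "nft add table inet iot_block",
    "nft 'add chain inet iot_block forward"
    " { type filter hook forward priority 0 ; }'",
    f"nft add rule inet iot_block forward"
    f" ip saddr {IOT_DEVICE} oifname {WAN_IF} drop",
]
for cmd in commands:
    subprocess.run(cmd, shell=True, check=True)
```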
I've swapped out a few of my Zigbee devices in the past, and even though I deleted the original device, HA adds a "_2" to the entity ID, which breaks any automation that uses it, even if the friendly name stays the same. The only time I've seen this not happen is when a device drops off the network and I re-pair it. Is there a trick to making this work? Even if I don't switch to Z2MQTT, this would be really useful to know. I have a few unreliable cheap door sensors that I'd like to replace, but they're tied to so many automations that I've been dragging my feet on it.
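To make the breakage concrete: an automation pointed at binary_sensor.front_door silently stops firing once the replacement comes back as binary_sensor.front_door_2 (those IDs are just examples). If there's no trick, I guess the fallback is renaming the entity_id back after pairing; if I understand HA's websocket API correctly, that would look something like this sketch, with placeholder host, token, and IDs:

```python
import asyncio
import json
import websockets  # pip install websockets

# Sketch: rename a re-paired device's entity_id back to the original so
# existing automations keep working. Host, token, and entity IDs are
# placeholders; assumes HA's config/entity_registry/update command
# behaves as documented.
HA_URL = "ws://homeassistant.local:8123/api/websocket"
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"

async def rename_entity(current_id: str, wanted_id: str) -> None:
    async with websockets.connect(HA_URL) as ws:
        await ws.recv()  # server sends "auth_required" first
        await ws.send(json.dumps({"type": "auth", "access_token": TOKEN}))
        await ws.recv()  # expect "auth_ok"
        await ws.send(json.dumps({
            "id": 1,
            "type": "config/entity_registry/update",
            "entity_id": current_id,      # the unwanted "_2" entity
            "new_entity_id": wanted_id,   # the ID the automations reference
        }))
        print(await ws.recv())  # success/failure result from HA

asyncio.run(rename_entity("binary_sensor.front_door_2",
                          "binary_sensor.front_door"))
```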
When you talk about the limits of ZHA, what are you referring to exactly? It would probably take an entire weekend for me to re-pair all the devices on my Zigbee network, but I'm not completely opposed to the idea if I gain some functionality that I didn't have before.
Based on what I read when I first set up HA, it seems like ZHA was somewhat lacking for quite some time but is now essentially equivalent to Z2MQTT. I went with ZHA because it seemed like the "default" for Zigbee.
My personal opinion is that Docker just makes things more difficult. Containers are fantastic, and I use plenty of them, but Docker is just one way to implement containers, and a bad one. I have a server that runs Proxmox; if I need to set up a new service, I just spin up an LXC and install what I need. It gives all the advantages of a full Linux installation without taking up the resources of a full-fledged OS. With Docker, I would need a VM running the Docker host, then I'd have to install my Docker containers inside that host, then forward any ports or resources between the hypervisor, the Docker host, and the Docker container.
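To illustrate what "just spin up an LXC" means in practice: it's one API call, or a few clicks in the web UI. A rough sketch using the proxmoxer Python client; the host, credentials, node name, template, and storage are placeholders for whatever your cluster uses:

```python
from proxmoxer import ProxmoxAPI  # pip install proxmoxer

# Sketch: create a fresh Debian LXC on a Proxmox node for a new service.
# Host, credentials, node name, template, and storage are placeholders.
proxmox = ProxmoxAPI("proxmox.local", user="root@pam",
                     password="YOUR_PASSWORD", verify_ssl=False)

proxmox.nodes("pve").lxc.create(
    vmid=200,                          # next free container ID
    hostname="new-service",
    ostemplate="local:vztmpl/debian-12-standard_12.7-1_amd64.tar.zst",
    storage="local-lvm",
    cores=2,
    memory=1024,                       # MB
    net0="name=eth0,bridge=vmbr0,ip=dhcp",
    password="ROOT_PASSWORD_FOR_CONTAINER",
)
```

From there it's `pct enter 200` (or SSH) and apt install whatever the service needs; there's no separate Docker-host VM sitting in the middle.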
I just don't get the use-case for Docker. As far as I can tell, all it does is add another layer of complexity between the host machine and the container.
.NET is infuriating enough on Windows. Any time I have to work with a .NET library, I always write a wrapper with a C or C++ interface first. Your friend who does .NET development on Linux has far more patience than I can ever hope to have.
RTX is one of those things that just isn't optional for me. I may be in the minority, but I am far more concerned with how games look than how they run. As long as my FPS is above 30 or so, I'm generally okay with performance. I feel like Windows will always support those "extra features" like RTX before Linux, unfortunately. It really comes down to market share, I think; the developers at Nvidia and AMD are going to target Windows first, and the people who maintain Proton are stuck in second place. You'll have to pry Windows 10 out of my cold dead hands, though; I liked Vista better than Windows 11.
For development, I'm locked into Windows at work, but my job isn't specifically software development; it just happens to be a useful skill to have in my career. I do far more coding at home, and I certainly have the option of switching to Linux. I think I've just been spoiled by Visual Studio's all-in-one approach for so long. My #1 concern is debugging. I haven't seen a Linux IDE that allows stepping back through the call stack and checking variable states inside the IDE quite the way VS does.
To be clear, I'm not bashing Linux at all. I've been a homelabber for longer than I can remember, and I have 3 physical machines and VMs that run Windows compared to probably 20 that run Linux, FreeBSD, or some other POSIX variant. I have so few Windows machines that I actually own legal licenses for all of them. I do feel like the people who say "Just run Linux on your desktop PC; it can do everything Windows can" are looking at the operating system through rose-colored glasses. Linux will always be the best choice for anything that doesn't require having a monitor attached, but otherwise, it feels like it's playing catch-up to Windows.
I've tried switching to Linux exclusively multiple times, and I always end up falling back to Windows on my desktop. I have multiple Linux servers and VMs, but there are two main barriers. First is gaming. Last time I tried, I couldn't get RTX working in some titles, the EA launcher was broken, and the experience was generally just buggy. The second reason is coding. I've been coding for Windows for almost 20 years, and I am hugely reliant on Visual Studio. I just can't find a comparable alternative for Linux.
I'd ditch Windows in a second if I could make Linux work for me, but so far I haven't had much luck.
I don't think I was confused. My head unit falls into the category you mention as "some after-market head units come with Android installed." It runs a full-featured Android OS, and in addition, it allows me to connect my phone via Android Auto. My question was about the benefits of using Android Auto to connect my phone vs simply using the native Android OS on my head unit.
That makes a lot of sense; I would definitely be using it if I didn't have Android on the head unit. One note: I'm using Bluetooth for Android Auto, and I'd do the same for tethering, so I wouldn't have to plug the phone into USB either way, except for charging.
The only person who should be able to "opt out" a child from vaccination should be an MD or DO, and they had better have a damned good medical reason for it.
Mask mandates never should have been lifted in the first place. We already have to wear pants, a shirt, and some kind of footwear in public, and that requirement has no practical justification beyond cultural norms. Adding a mask is such a simple request that it blows my mind people are so strongly against it; it's one of the few requirements for personal attire that actually has a good reason behind it, and it's incredibly easy to put one on when you're going to be face-to-face with another person.
Human beings have developed the science and technology to grow crops to feed the population on a massive scale. In fact, growing plants takes a much lower energy input per output calorie than farming animals for meat. Meat production requires the production of plants first in order to feed the meat animals; it's extremely inefficient compared to producing plants for direct human consumption. Not only does a vegetarian diet reduce animal suffering, it's also a more efficient use of natural resources.
Predatory animals do not have this option. The owl that eats a mouse isn't doing it because he would rather eat a mouse than a soybean. He's doing it because eating a mouse is the only way he'll survive. Owls do not have farms, genetically modified crops, fertilizers, statistical analysis of crop yields, or any of our other agricultural advances.
The fact of the matter is that humans have no need to eat meat. We eat meat because we want to, not because it's necessary for our survival. If you choose to have a steak for dinner, you're deciding that your desire for a specific flavor of food is more important than the suffering of the cow that provided it.
We are still evolving culturally, but we have moved past a lot of horrible things we did throughout history. We can afford animals the same right to life and happiness that we afford each other. The fact that so many people refuse to make even the smallest effort toward that goal is disgusting. "Eat something else" is such a simple request, yet the smallest inconvenience is just too much to handle. What does that say about our species?
Words like "murder" and "rape" don't apply to non-human animals only because, for much of history, taking those actions against animals was a necessary evil for our survival. Our culture has evolved over time, and we no longer do many of the horrible things that were commonplace centuries ago. We need to keep evolving away from eating meat, and our language needs to evolve with us.
I strongly disagree. On one side, people get to eat, but conscious, feeling creatures are killed so that humans can have their "preferred" source of food. On the other side, people still get to eat, animal suffering is greatly diminished, and the only cost is that some people may not enjoy their dinner quite as much.
I refuse to accept that the atrocities that are committed against what we call "meat animals" are worth it to satisfy someone's culinary preferences. You can get all the nutrition you need from plant-based sources.