
  • Because Congress creates really, really complicated tax laws that reduce tax burdens for business losses and favor structures that, if you're wealthy enough, you can exploit for things that aren't actually contributing to our economy. A regular person with a few deductions that might go over the standard deduction? You might want some help with that, even if you're "doing it yourself" with the help of software (just like almost all tax preparers do). If you're just taking the standard deduction, it is easy.

    Why can't you just fill out a 1040EZ in a webform and file online through the IRS?

    Companies like Intuit lobby Congress so that the IRS can't just make it easy for most people. For example, the IRS wanted to have its own free tax-filing web app, just like TurboTax, but nope.

    You can specifically blame Intuit as a company that has actively tried to stop the IRS from making tax filing free and easy for most Americans.

    The same thing has happened with weather in the US. NOAA/NWS provides pretty much all weather and radar data to the public for free. They also provide a public API, and all those weather apps either directly or indirectly get their US data from there (a quick sketch of pulling from that API is at the end of this comment).

    Why don't they have an app like all these weather services do? Well, companies lobbied so the NWS isn't allowed to. The best they can do is provide a website that you can "use like an app." Paying extra for that up-to-date radar feed in your weather app? Guess what? You already paid for it through taxes, and now you get to pay a private company to simply display it in their interface.

    You can specifically blame AccuWeather as an example of a company that has actively tried to limit the NWS and actually lock the public out of the data that they already paid for through taxes.

    Congress has the power to end this nonsense but does not. I'm sure it won't surprise anyone when I say that US law is slanted toward empowering businesses and lowering their accountability.
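
    A minimal sketch of pulling a forecast from the NWS public API, assuming the standard api.weather.gov endpoints and the requests library; the coordinates, User-Agent, and email address are just placeholders:

    ```python
    import requests

    # NWS asks for a descriptive User-Agent so they can contact you if something breaks.
    HEADERS = {"User-Agent": "example-weather-script (contact: me@example.com)"}

    # Example coordinates; swap in your own.
    lat, lon = 39.7456, -97.0892

    # Step 1: look up the gridpoint metadata for this location.
    point = requests.get(
        f"https://api.weather.gov/points/{lat},{lon}", headers=HEADERS, timeout=10
    ).json()

    # Step 2: follow the forecast URL returned in that metadata.
    forecast_url = point["properties"]["forecast"]
    forecast = requests.get(forecast_url, headers=HEADERS, timeout=10).json()

    # Print the next few forecast periods.
    for period in forecast["properties"]["periods"][:4]:
        print(period["name"], "-", period["shortForecast"],
              f"{period['temperature']} {period['temperatureUnit']}")
    ```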

  • EDIT: Sorry, I replied to the wrong reply here. However, if you're interested in these exemptions, you can read through them.

    https://www.federalregister.gov/documents/2021/10/28/2021-23311/exemption-to-prohibition-on-circumvention-of-copyright-protection-systems-for-access-control

    /EDIT

    Audiovisual works, including video, fall under Section III (1) of the exemptions.

    Computer Programs fall under Section III (5 - 12).

    Software isn't even explicitly mentioned until Section III (6), though e-books fall under Section III (1D), which is obviously software.

    Under PROPOSED CLASS 1: AUDIOVISUAL WORKS---CRITICISM AND COMMENT, opponents of the exemption in Section III (1), such as the DVD CCA, specifically raised screen captures and classroom viewing of the media among their concerns.

    These exemptions definitely do not encompass software only.

    The DMCA and copyright law do not allow any of this without the exemptions. These are exemptions to those laws, and what is contained in them is legally allowed.

  • I think what we've been seeing since 2022 is that a line was crossed that energized young adults to actually vote. Historically, this is the demographic that doesn't vote. The GOP has been losing state and federal elections because of it.

    I personally believe that if we see more hardline far-right GOP candidates, they will fill fewer and fewer seats. The GOP will then basically have to split in two to survive. Then, I'd imagine, we'd see the far right die off.

    We saw something similar, though more minor and swifter, when the GOP started to splinter into the 'Tea Party'. This would probably be a bigger culling that lasts a lot longer.

    If it doesn't happen that way, the GOP just won't survive as a major party.

    In the US, registered Democrats account for only 27% of voters, exactly the same as Republicans at 27%, with the remaining 45% registered as Independents. This is in contrast to 2004, when only 27% of registered voters were Independents. Source: Gallup.

    In the best outcome, hopefully, this will introduce other parties that will also be considered 'major' parties and have a chance on ballots all over the country. This could be a catalyst for real change in the US, if the GOP can't get its shit together.

  • Trump's lawyers: Banks should have done their own due diligence for his properties themselves. We even put a disclaimer to not trust us!

    Also Trump's lawyers: Failed to do due diligence on Trump's properties in court.

    I wonder if their slides had a disclaimer to not trust their evidence.

  • In Boxes, power down your XP VM, click Settings - Sharing Panel - Enable Sharing toggle. Click File Sharing and enable File Sharing. Power on the VM.

    At that point you should be able to drag and drop from your host direct into your VM for a file transfer.

    You can also click the vertical-dots menu in the guest's console "screen" and choose the Send File... menu option.

    In the same menu you can click Devices & Shares (alongside "Realtek USB" or whatever devices are listed), go to Local Folder, select the host folder you'd like to share from the dropdown, click Save, and make sure the toggle on the right is on.

    Then your folder, I believe in XP, will show up as a removable drive like a USB drive would.

  • GNOME's Boxes is pretty easy to use and of course uses QEMU + KVM. This would be a type 1 hypervisor vs. VirtualBox's type 2. It is point-and-click like VirtualBox, and you don't need to use GNOME's DE to use Boxes. (A quick way to check that KVM is actually available is sketched at the end of this comment.)

    I have seen people post about your specific error for years; it usually comes from using the VirtualBox website's repository instead of their own distro's repository (if one exists).
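
    If you want to sanity-check that KVM is actually available before reaching for Boxes, a look at the CPU flags and /dev/kvm is enough. A minimal sketch (Linux-only, standard library only):

    ```python
    import os

    def kvm_ready() -> bool:
        """Rough check: the CPU advertises virtualization and /dev/kvm is usable."""
        with open("/proc/cpuinfo") as f:
            cpuinfo = f.read()
        has_vt = ("vmx" in cpuinfo) or ("svm" in cpuinfo)   # Intel VT-x or AMD-V
        has_dev = os.path.exists("/dev/kvm")                # kvm module loaded
        usable = has_dev and os.access("/dev/kvm", os.R_OK | os.W_OK)
        print(f"virtualization flag: {has_vt}, /dev/kvm present: {has_dev}, usable: {usable}")
        return has_vt and usable

    if __name__ == "__main__":
        kvm_ready()
    ```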

  • Back when I was a hardware engineer (embedded hardware, not really part of IT) for avionics, most of the 'black boxes' I'd see with non-standard interfaces were really just PCs on a motherboard with a 'bus controller' (not really a bus controller) card that slotted into PCI. You just have to pass that PCI device from the hypervisor through to the VM where the drivers and the OS that uses them sit.

    An issue that hangs some people up when virtualizing hardware that required an RTOS is the CPU scheduler (due to vCores/HT/SMT), but those systems didn't run on Windows of course. My solution is to just pin the VMs running an RTOS to dedicated physical cores, taking every odd core if I can't simply turn HT/SMT off. Works great. (A rough sketch of the pinning idea is at the end of this comment.)

    Most data connections are just serial types, and the Data +/- or TX|RX lines are simply swapped in the pin-out, with a 'proprietary' form factor that's easy to pigtail into whatever.

    Maybe I should just go into business modernizing old lab and factory equipment's compute.
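
    The usual way to do that pinning is libvirt's vcpupin (or plain taskset), but the underlying idea is just a CPU-affinity call. A rough sketch of that idea; the PID and the core list are assumptions, and whether odd-numbered cores actually avoid SMT siblings depends on your CPU topology:

    ```python
    import os

    # Hypothetical PID of the QEMU process backing the RTOS guest;
    # in practice you'd look it up via libvirt or pidof rather than hard-code it.
    qemu_pid = 12345

    # Reserve the odd-numbered cores for the guest
    # (same effect as `taskset -cp 1,3,5,7 <pid>` or libvirt's <vcpupin>).
    rtos_cores = {1, 3, 5, 7}

    os.sched_setaffinity(qemu_pid, rtos_cores)
    print("Guest now pinned to cores:", sorted(os.sched_getaffinity(qemu_pid)))
    ```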

  • I actually posted that in sciencememes a few days ago, including other solutions as well as hardware passthru. People kept replying that it wasn't a solution: supposedly the lab doesn't have the expertise, so somehow after two decades the only option is to keep fighting a losing battle of maintaining hardware that is no longer made, and if the source code were available it would somehow be magically maintained by magic software developers who were still interested in it after all this time.

    There's more goalpost-moving and some stretched assumptions in the responses, but that's the ultimate gist.

    It isn't that I was arguing about code rights dying with a vendor, or even about source code availability; I was merely posting that these types of problems are far too common and already solvable outside of severe edge cases.

  • Cause the instrument is important and replacing it, aside from being a massive waste of a perfectly functioning instrument, costs hundreds of thousands if not millions of € that we can’t spend

    Why would you need to replace the instrument? You only need to replace the computers' functions. Why does it need to cost anything beyond some old workstation that got tossed into an e-waste bin years ago?

    some dude on Lemmy said we shouldn’t use stop-gap measures for a problem that’s completely artificial.

    As opposed to some dude on Lemmy bemoaning that this just can't be solved without source, even though I've given actual solutions that are available now and for little to no material cost?

    You have admitted that you'd still have to rely on someone else's expertise and motivation in the hopes that they'd solve the problem for the lab. Yet, in my opinion, you're discarding the solutions I've presented as if they aren't solutions at all because, at least in one of your points, they'd require relying on someone else's expertise and motivation in the hopes that they'd solve the problem for the lab. Even then, as I said, they've had decades to figure it out, and there are already freely available step-by-step instructions to solve the problem, or at least get them most of the way there, assuming there isn't some proprietary hardware that was never mentioned.

    Anyway, I don't really have anything else to add to the conversation. So you can have the last word, if you wish.

  • Secondly, you’ve combined app categories that don’t fit. Google+ was a social network, Hangouts was a chat app

    Hangouts was originally part of Google+, hence "Google Hangouts (2013), which was part of Google+ (2011)"

    If you don't recall it as a feature within G+, then at least trust an article talking about it.

    Hangouts was third, a real-time video chat product embedded in Google+.

    The Verge (2013) EXCLUSIVE: INSIDE HANGOUTS, GOOGLE'S BIG FIX FOR ITS MESSAGING MESS

    Finally, you’ve conflated technologies. Android Automotive OS is an entire OS running in a car that is maintained by the OEM in much the same way as Android is on phones.
    ...Incidentally, this has nothing to do with Android Auto, which is an extended display for your phone.

    I mention both as they are intended to provide the same functionality, regardless of the underlying technology -- integration of a vehicle's infotainment with a Google-provided ecosystem. In fact, Android Auto apps are compatible with Android Automotive because, the technical 'why' aside, the function to the end user is the same.

    Google has been around for 25 years and always has chased innovation. They create a ton of things, see what sticks, then iterate or pivot.

    According to many Googlers over the years, the reason many of these projects are eventually discontinued and fail isn't that things 'aren't sticking' but rather the internal culture: to set yourself apart and get good performance ratings, you must always strive to be on teams doing something new. This leaves little to no resources for maintaining the 'old', regardless of how much people like it (or not).

    While I too have been frustrated by the discontinuation of service I liked

    I don't know about everyone else, but I wrote what I wrote, not because I'm frustrated about a discontinuation of any service I liked from Google. That happens. It is because the branding and evolution of products are confusing and sometimes, they even coexist. From my perspective, it often seems as if there is no actual long-term plan or guidance for many services that have come and gone with no signs of that changing.

    The perception of the chaotic mess that Google brings to many of its services, past, present, and probably future, is at least something I felt I wanted to criticize. They deserve it regardless of the supposed intentions behind the curtain.

    Whenever I hear this kind of complaint, it sounds to me that people just want Google to be more like Apple or Microsoft and churn out minor improvements to their existing money makers with minimal innovation.

    That's your opinion I suppose but it is not mine. My opinion is that Google should at least change the perception of their products to have clear and clean plans as they evolve. This would give me a reason to trust their branding more.

    You mentioned Duo and Allo, which coexisted with Hangouts for a time. The lack of interoperability created a confusing schism within the very userbase that used them. You could argue that they somehow 'innovated' chat and video conferencing, but they didn't even call them something like Hangouts Chat and Hangouts Video when they split the functions apart, with a clear handover from Hangouts itself.

    I think people would just prefer Google appears to be less arbitrary and in disarray about their products. If we are to believe some of the people that actually worked on these products, then that is going to require a culture change within.

  • So again and again and again, I was not arguing against the abandonware issue. I take issue with how the problem is being stop-gapped in this current situation and not in some hypothetical alternate timeline.

    Instruments like the ones we use are super expensive

    Great. I didn't imply otherwise.

    On top of that most people here barely understand computer and software

    So the lab guy maintaining Windows 95-era computer hardware barely understands computers. Got it. I suppose this same lab guy won't be able to do anything even if the source code were available and would still be doing the same job.

    What you’re suggesting is treating the symptoms but not the disease. Making certain file formats compatible with other programs is not an easy undertaking and certainly not for people without IT experience.

    I didn't say otherwise. I said they've had 20 years to figure it out. What would source code availability solve for them then? We could assume other people would come together to maintain it, sure. I've also talked about other solutions in replies, and there are even more; I wasn't trying to cover all bases there. It is just that this has been a problem for a couple of decades, so there has been plenty of time to solve it.

    Software for tools this expensive should either be open source from the get-go or immediately open-sourced as soon as it’s abandoned or company goes bust

    Oh OK, so that makes it less complicated. I thought the assumption here was that, in general, anyone in that lab barely understands a computer or how software works. So who's going to maintain it? Hopefully others, sure. I actually do talk about this in other replies: it is something I support, and in this case the solution would be to deliver the source with the product. FOSS is fantastic. But why can't that be done now by these same interested parties? Or are we back to "can't computer" again? Then what good is the source code anyway?

    But again, that's a "what-if things were different" which isn't what I was discussing. I was discussing this specific, real and fairly common issue of attempting to maintain EOL/EOSL hardware. It is a losing game and eventually, it just isn't going to work anymore.

    Even with plenty of funding to workaround the issue that shouldn’t be necessary, it’s a waste of time and money just so a greedy company can make a few extra bucks.

    Alright, the source code is available for this person. Let's just say that. What now?

    What can be done right now is fairly straightforward, and there are numerous step-by-step guides for it: virtualize the environment (a bare-bones sketch of what that looks like is at the end of this comment). There is also the option of hardware passthru if there is some unmentioned piece of equipment. This could be done with some old laptop or desktop that was probably tossed in a dumpster 10 years ago, so the cost is likely just some labor. Perhaps that same lab guy can poke around, or, if they're at a university, their department can reach out to the Computer Science or another IT-related teaching department and ask for volunteers, even undergrads. There are very likely students who would want to take it on just because they want to figure it out and nothing else.

    There may be an edge case where it won't work due to some embedded proprietary hardware, but then we're at yet another hypothetical, which is open-source hardware. That's great, but who's going to make that work on a modern motherboard? The person you've supposed can't do it because they barely understand a computer at all?

    In current reality, for the specific part of the post I am addressing, the current 'solution' of sustaining something ancient with a diminishing parts supply is definitely not the answer. That is the point I was making. There were potentially 20 years of labor hours. There were potentially 20 years of budget allocations. And let's not forget, according to them, it is "CRITICAL" to their operations. Yet it is maintained by a "lab guy" who may or may not have anything beyond a basic understanding of computers, using hardware that's no longer made, hoping to cannibalize parts, buy second-hand, and find things in bins somewhere.

    If this "lab guy" isn't up to the task, then why are they entrusted with something so critical with nothing done about it in approximately two decades? If they are up to the task, then why isn't a solution with longevity and real risk mitigation being taken on? It is a short-sighted mentality to just kick it down the road over and over again plainly hoping something critical is never lost.

  • or even basic product management.

    Google Wallet (2011) became Android Pay (2015) became Google Pay (2018) became Google Wallet (2022), except in some places. Also, except in the US (and maybe elsewhere?), where Google Pay is still around but just to send money between people.

    Google Talk (2005) and Google+ Messenger (2011) sort of became Google Hangouts (2013), which was part of Google+ (2011) before becoming standalone Hangouts. Hangouts then spawned both Duo (2016) and Allo (2016), but then, while Duo and Allo were both still around, Hangouts split into Hangouts Meet (2017) and Hangouts Chat (2017), which became Google Meet (2020 -- yes, classic Hangouts was still around) and Google Chat (2020 -- same story). Google Allo died in 2018 and Duo died in 2022.

    Inbox (2015) became a better Gmail Android app than Gmail actually was. Inbox was discontinued in 2019 with the promise that Gmail had integrated Inbox's features (it didn't add most of them). This spawned other third-party Gmail apps to take its place.

    Google Play Music's (2011) podcast section split off into Google Podcasts (2018), which stopped getting releases in 2021 and rolled up/is rolling up into YouTube Music (2015). Google Play Music itself became YouTube Music in 2020.

    Right now there are even Android Auto and Android Automotive existing simultaneously to do pretty much the same thing, yet they are not the same. Android Automotive itself exists both as Android Automotive with Google Automotive Services and as Android Automotive without Google Automotive Services.

    Android Auto for Phone Screens was replaced with Google Assistant's driving mode.

    There are many, many, many more crazy branding issues but I just don't feel like continuing. Google has also killed at least 54 hardware lines, 59 apps and 210 services.

  • It isn't necessarily a computer programming problem either. Rather, it is an IT problem, at least in part, and one that the poster states is the primary job of his 'lab guy': to maintain two ancient Windows 95 computers specifically. That person must know enough to sustain the troubleshooting and replacement of the hardware, and certainly at least the transfer of data from the old spinning hard drives. Why not instead put that technical expertise into actually solving the problem long-term? Why not just run both in QEMU and use hardware passthru if required? At least then you would rid yourself of the ticking time bomb of hardware and its diminishing availability. That RAM that is no longer made isn't going to last forever. They don't even need to know much about how it all works; there are guides available, even for Windows 95.

    Perhaps there are other hurdles, such as something running on ISA, but even so, eventually it isn't going to matter. Primarily, it seems the real hurdle is the software and the data it handles. Does it really have some sort of ancient hardware dependency? Maybe. But in all that time, this 'lab guy', whose main role is just these two machines, must have had some time to experiment and figure this out. The data must be copyable, even as a straight hard-drive image if it isn't a flat file (extremely doubtful, but it doesn't matter). I mean, the data is, by the author's own emphasis, CRITICAL.

    If it is CRITICAL then why don't they give it that priority, even to the lone 'lab guy' that's acting IT?

    Unless there's some big edge case here that just isn't being stated, and there is something above and beyond the software they speak about, I feel like I've put more effort into typing these responses than it would take to effectively solve the hardware-on-life-support side of it. Solving the software dependency side? Depending on how the datasets are logically stored, it may require a software developer, but it also may not. However, simply virtualizing the environment would solve many, if not all, of these problems with minimal investment, especially for CRITICAL (their emphasis) data with 20 years to figure it out. It would simply take a new computer, some media to install Linux or BSD on, and perhaps a COTS converter if the instrument uses something like an LPT interface or even a DB9/DE-9 D-sub (you can still find modern motherboards, cards, and even laptops that support those, but a cheap USB adapter certainly works as well; a tiny sketch of reading a serial instrument that way is at the end of this comment).

    Anyway, I'm just going to leave it at that. I think I've said a lot on the subject to numerous people and do not have much more to add, other than that this is most likely solvable, and outside of severe edge cases, solvable without expert knowledge considering the timeframe.
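
    On the cheap-USB-adapter point: once the instrument's serial line is on a USB adapter, reading it from a modern machine takes a handful of lines. A minimal sketch, assuming the pyserial package and made-up port settings; the real device path, baud rate, and framing depend entirely on the instrument:

    ```python
    import serial  # pip install pyserial

    # Hypothetical settings for a USB-to-serial adapter; match the instrument's manual.
    PORT = "/dev/ttyUSB0"
    BAUD = 9600

    with serial.Serial(PORT, BAUD, timeout=2) as link, \
         open("instrument-capture.log", "ab") as log:
        # Many old instruments just stream ASCII lines; log whatever arrives.
        while True:
            line = link.readline()          # returns b"" on timeout
            if line:
                log.write(line)
                print(line.decode(errors="replace").rstrip())
    ```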

  • Well, I think a better solution would be to deliver all source code with the compiled software as well. I suppose that would extend to the operating system itself and the hope that there'd be enough motivation for skillful folks to maintain that OS and support for new hardware. Great, that would indeed solve the problem and is a potential outcome if digital rights are overhauled. This is something I fully support.

    What is stopping them now from solving access to this data, even if it's in a proprietary format?

    Really, again, I don't take issue with the abandonware argument but rather with the situation I posted about itself. Source code availability and the rights surrounding it are only one part of the larger problem in the post.

    Source code and the rights to it aren't the root cause of the problem in the post I was addressing. They could facilitate a solution, sure, but given that there are at least 20 years of data currently at risk, there were also 20 years of potential labor hours to solve it. Yet, instead, they chose to 'solve' it in a terrible way. That is what I take issue with.

  • I didn't say capitalism is perfect nor did I imply it.

    So hypothetically let's say the vendor lost the rights to the software since it is abandonware -- great. I'd love it.

    What changes for justmeremember's situation? Nothing changes.

    I suppose your only issue here is that the software vendor or some entity should support it forever. OK, so why didn't they just choose a FOSS alternative or make one themselves? If not then, why not now? There is nothing that stops them from the latter other than time and effort. Even better, everyone else could benefit!

    Does that make justmeremember just as culpable here or are they still the victim with no reasonable way to a solution?

  • I posted simply because this specific issue is much too common, and just as common is the failure to actually solve it, regardless of the abandonware argument, instead of stop-gapping and kicking it down the line until access to the data is gone forever.