Amazon builds AI model to optimize packaging
CalcProgrammer1 @lemmy.ml

I don't really see why they would hire him to achieve this goal. He had already quit as maintainer. He was out of the picture unless he resigned specifically because he accepted an offer from NVIDIA, but if that were the case and they wanted nouveau stopped, then why is he now contributing a huge patchset? If they hired him and he quit nouveau, they could've had him work on the proprietary driver or their own open out-of-tree kernel driver, but they specifically had him (or at least allowed him) to keep working on nouveau.
Also, if they really wanted to EEE nouveau into oblivion, they would need to get every single prominent nouveau, nova, and NVK developer on payroll simultaneously before silencing them all, because once one gets silenced, why would any of the others even consider an NVIDIA offer? Especially those already employed at Red Hat? It doesn't really make sense to me as an EEE tactic.
What has been apparent over the past few years is that NVIDIA seems to be relaxing their iron grip on their hardware. They were the only ones who could expose reclocking in a way that a theoretical open source driver could use, and they did exactly that: they moved the functionality they wanted to keep hidden into firmware. They had to have known that doing this would let nouveau use it too.
Also, they're hopping on this bandwagon now that NVK is showing promise of being a truly viable gaming and general-purpose driver. Looking at the AMD side of things, they did the same thing back when they first started supporting Mesa directly: they released some documentation, let the community get a minimally viable driver working, and then poured official resources into making it better. I believe the same situation happened with the Freedreno driver, with Qualcomm eventually contributing patches officially. ARM also announced their support of the Panfrost driver for non-Android Linux use cases only after it had been functionally viable for some time. Maybe it's a case of "if you can't beat them, join them", but we've seen this before: companies eventually start helping out on open drivers only after dragging their feet for years.
I'm cautiously optimistic. While I could see NVIDIA hiring him to stifle nouveau development, it doesn't really seem worth it when he already quit as maintainer and Red Hat is already working on nova, a replacement for nouveau.

I got into Linux with Ubuntu 6.06 and remember the situation then. NVIDIA and ATI both had proprietary drivers and little open source support, at least for their most recent chipsets of the time. I was planning on building a new PC and going with an NVIDIA card because ATI's drivers were the hottest of garbage and I had a dreadful experience going from a GeForce 4 MX420 to a Radeon X1600Pro. However, when AMD acquired ATI, they released a bunch of documentation. They didn't immediately start paying people to write FOSS Radeon drivers, but the community (including third-party commercial contributors) started writing drivers from those documents. Radeon support quickly got way better. Only after there was a good foundation in place do I remember seeing news about official AMD-funded contributors to the Mesa drivers.

I hope that's what we're now seeing with NVIDIA. They released "documentation" in the form of their open kernel modules for their proprietary userspace, as well as reworking features into GSP to make them easier to access, and now that the community-supported driver is maturing, they see it as viable enough to directly contribute to.
I think the same may have happened with the Freedreno and Panfrost projects too.
That's where my cautious optimism comes from. I hope they follow this path like the others did and don't use this to stifle the nouveau project. Besides, stifling one nouveau dev would mean no other nouveau/nova/Mesa devs would accept future offers from them. They can't shut down the open driver at this point, and the GSP changes seem like they purposely enabled this work to begin with. They could've just kept the firmware locked down and nouveau would've stayed essentially dead indefinitely.
I just use the default case for the most part. I have a third-party case from Amazon with a larger internal storage compartment that I use when traveling, as I can fit a battery bank, Bluetooth earbuds, and extra cables in it.
Fuck Riot. Never playing their games again. If you're going to have a shitty anticheat, at least give people the option to play in anticheat-disabled lobbies. Besides, they should be doing anticheat at the server level, not spying on the boot sequence of client PCs. That shit is unnecessary for a fucking banking app, let alone a goddamn game. It's just a game; let us enjoy it rather than making such a ridiculously over-the-top response to cheating.
Yeah, this headline is stupid ragebait. "RISC-V development board company chooses RISC-V chip for their latest RISC-V development board" doesn't have the same level of nonsensical anti-China rage in it.
Why support closed source software that hassles you when 7-zip is open source and works great?
I prefer the USB port to be on the bottom, but very few phones (at least in the smartphone era) even tried moving the USB port elsewhere. Headphone jacks were frequently on top. I like the USB port on the bottom in the center so it can sit on a stand with a cutout in the center (which is a pretty common design).
It's not just 32-bit on 64-bit; new Macs use ARM64 processors, so x86/x86_64 code is effectively obsolete on Mac. I would love to see Valve pour resources into a cross-platform x86-on-ARM64 emulation layer though; it would benefit Linux as well.
Doesn't really matter as long as the jack exists in the first place.
I have both, mainly got the Ally as an experiment. The Deck is absolutely the way to go. Windows is a dreadful experience in general, but especially so on a handheld. No touchpads means awful mouse control, but Windows means an OS designed around mouse control. Asus' software feels like a big hack (because it is) haphazardly glued on top of a stock Windows desktop. Steam Big Picture works OK but the Steam menus are limited in functionality compared to using them on SteamOS and the Deck. Meanwhile, the Deck is an incredibly polished product and the SteamOS interface is controller-first. You can still go to the desktop and use it as a PC, but you won't wind up there accidentally like you will on the Ally. The SteamOS gaming mode is built around operating with a controller and everything works well.
As for running Linux on the Ally? It is doable, but the experience is nowhere near as good as on the Deck. No seamless sleep and resume, issues with button mapping, limited tweaking of power limits, and more. Just get a Deck OLED and be happy.
Get some HDMI to VGA adapters, the kind that screw into the VGA port and then have an HDMI port. I have a bunch of old VGA monitors I use with Raspberry Pis and as test displays when working on PCs, and I never have to deal with the annoyances of VGA since they're basically HDMI displays now.
I started using Linux with Ubuntu 6.06, and at the time I was really into the game Jedi Academy. It used OpenGL and thus ran fairly well in Wine. I upgraded from an NVIDIA GeForce 4 MX420 to an ATI Radeon X1600Pro, and the ATI drivers were absolute garbage, so I kinda gave up on Linux gaming for a while. I was set on going NVIDIA for my next PC, but around that time AMD bought ATI and opened up their documentation, leading to rapid improvements in the open source AMD drivers.

I went with a Radeon HD 5870, and not long after I built that PC I was gaming in Wine again, though non-OpenGL games still ran poorly. Then Steam for Linux officially released and a lot of native games became available, but I was still running Windows Steam in Wine since native Steam didn't play Windows games. Then the Gallium Nine project offered a way to play DX9 games with significantly improved performance, and I played a lot of Skyrim on Linux as well as a lot of other DX9 games. Then Vulkan happened, and soon DXVK and Proton, and the modern Linux gaming landscape evolved quite rapidly until we got to where we are today.
On most of the laptops I've seen, the external port is connected to the dGPU.
Most gaming laptops these days don't support true GPU switching as it requires a hardware mux to switch the display between the GPUs. Every gaming laptop I've used from the past decade has been muxless and only used render offloading.
I think it's the other way around. NVIDIA's marketing name for render offloading (muxless) GPU laptops is NVIDIA Optimus, so when the Mesa people were creating the open source version, they called it PRIME.
Most gaming laptops these days don't do GPU switching anyways. They do render offloading, where the laptop display is permanently connected to the integrated GPU only. When you want to use the discrete GPU to play a game, it renders the game frames into a framebuffer on the discrete GPU and then copies each completed frame over PCIe into a framebuffer on the iGPU, which then outputs it to the display. On Linux (Mesa), this feature is known as PRIME. If you have two GPUs and you run DRI_PRIME=1 <command>, it will run the command on the second GPU, at least for OpenGL applications.
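For example, something like this is a quick way to check which GPU an OpenGL app ends up on (assuming mesa-utils is installed for glxinfo/glxgears; the exact renderer strings will obviously differ per machine):

```
# Without the variable, rendering stays on the default (integrated) GPU
glxinfo | grep "OpenGL renderer"

# With DRI_PRIME=1, Mesa offloads rendering to the second GPU
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
DRI_PRIME=1 glxgears
```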
Vulkan seems to default to the discrete GPU no matter what. My laptop has an AMD iGPU and an NVIDIA dGPU and I've been testing the new NVK Mesa driver; render offloading seems to work as expected. I would assume the AMD Mesa driver would work just as well for render offloading in a dual-AMD situation.

Hopefully they can find a new home. I am ashamed of GitLab. I used to love it, but they get worse and worse by the day. Maybe Codeberg would be a better home. Nintendo can't kill this; there will always be new places to host software, and it's open source.
It's absolutely ridiculous they took it down even though Nintendo didn't DMCA the Suyu project directly. Shitty corporate cover-our-ass behavior at its finest.
Same. I put together a knock-off Prusa i3 kit in 2014 that cost me like $600, and most of the parts were themselves 3D printed. Not a bad thing, but they were really rough prints. It printed OK for the time but was an endless source of annoyance. In comparison, the Ender 3 Pro basically just worked out of the box with minor bed-leveling tweaks, and everything else has just been minor quality-of-life improvements. It's great.
I'm just using a Dell PC monitor (21" 1080p) from like 2010. It supports HDMI, but I don't know about CEC. Either way, it could just put the monitor to sleep and that would be fine; that doesn't require CEC. I'm just not sure of a way to trigger this manually when I'm done using it.
This... actually seems like a good use of AI? I generally think AI is being shoehorned into a lot of use cases where it doesn't belong, but this seems like a proper place to use it. It's serving a specific and defined purpose rather than trying to handle unfiltered customer input or do overly generic tasks.