These quartets cards with peak late 00s design
edinbruh @feddit.it · 16 Posts · 310 Comments · Joined 2 yr. ago

In addition to what the others have said, Windows has already had its big paradigm change ("similar" to the change from X11 to Wayland that is happening now) in the past. It was around 2007, with Windows Vista. They also didn't get it quite right on the first try, but because Microsoft can do whatever they want, while on Linux you must convince the community that something is better, it was easier for them to just change everything under everyone's nose.
Hmmm. That's suspicious; there are a number of things in the way of video acceleration with that setup.
First of all, on Fedora (uBlue is a derivative of Fedora) you need to install openh264 from dnf and not from Firefox's extension manager, and then you still need to change some settings in about:config. Second, you are using a Flatpak, and I'm not sure whether openh264 needs to be installed "inside the Flatpak". And last, it might just be the Nvidia card.
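For reference, the usual steps on stock Fedora look roughly like this. The repo, package, and Flatpak ref names below are the ones Fedora and Flathub use as far as I know, but they may differ on a uBlue image, so treat this as a sketch, not gospel:

```shell
# Enable Cisco's openh264 repo (usually shipped but disabled) and install the codec
sudo dnf config-manager --set-enabled fedora-cisco-openh264
sudo dnf install -y openh264 gstreamer1-plugin-openh264 mozilla-openh264

# If Firefox is the Flatpak, the codec also has to exist inside the sandbox:
flatpak install -y flathub org.freedesktop.Platform.openh264
```

After that, the about:config prefs to check should be the `media.gmp-gmpopenh264.*` ones (make sure `media.gmp-gmpopenh264.enabled` is true).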
The first two would also affect AMD.
The Mesa drivers for OpenGL, Vulkan, etc. are likely already installed; what you need to install are the mesa-va and mesa-vdpau drivers for video acceleration. Other than that, you just need to make sure the GPU doesn't stay in power-saving mode when you play.
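On Fedora that means swapping in the full-featured builds from RPM Fusion (assuming you have RPM Fusion enabled; the -freeworld package names are theirs and could change):

```shell
# Swap the stripped-down Fedora builds for RPM Fusion's full builds
sudo dnf swap mesa-va-drivers mesa-va-drivers-freeworld
sudo dnf swap mesa-vdpau-drivers mesa-vdpau-drivers-freeworld

# Verify the decoder profiles (H.264 etc.) actually show up
# (vainfo is in libva-utils, vdpauinfo in the vdpauinfo package)
vainfo | grep -i h264
vdpauinfo | head
```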
Btw, video acceleration with Nvidia mostly works if you use this.
Mine is: I don't really listen to music
I doubt it's representative of the population. Because it's self-reported, at best it's representative of those who advocate for their favourite platform, which is just a particular portion of the population. Though it would be cool to see Wayland surpass X.
Wait, is it on a population of 5000 computers? Bruh, why are we even looking at this?
But it's so cool! It looks like some brutalist paranormal manifestation. Like, I bet the distorted part of the Oldest House (from the game Control) looked like this from the outside.
When they run out of words with G, they can just add another layer of recursion and create BIG, an acronym for "BIG Is GNU", and from there go on like a word-chain game.
Who the hell does that?
The USB protocol was simple by design, so it could be implemented in small, dumb devices like pen drives. More specifically, it used two pairs of wires: one pair for power and the other for data (four wires in total). Having a single half-duplex data line means you need some way of arbitrating who can send data at any time. The easiest way to do that is to have a single machine decide who gets to send data (the master), and the easiest way to pick the master is to not pick at all and have the computer always be the master. This meant you couldn't connect two computers together, because they would both try to be the master.
I used the past tense because, as you may have noticed, micro USB has 5 pins and not 4. That's because phones are computers, and they use the 5th pin to decide how to behave. If it's grounded, they act as a slave (the male micro to male A cable grounds it). If it has a resistor (the OTG cable has one), they act as the master. And if the devices are connected with a wire on that pin (on some special micro-to-micro cables), they negotiate the connection.
When they made USB 3.0, they realized that not having the 5th wire on the USB-A connector was stupid, so they added it (alongside some extra data lines); that's why USB 3 connectors have an odd number of wires. So with USB 3 you can connect computers together, but you need a special cable that uses the negotiation wire. Also, I don't know what software you need for it to work.
USB-C is basically two USB 3.0 links in the same cable, so you can probably connect computers with that. But often the port on the device only uses one, so it might not be faster. Originally they put in the pins for two connections so you could flip the connector, but later they realized they could use them to get double the speed.
Yeah, that would be terrible. Imagine if you were to run some updates and the package manager went like "Get <name of the distro> Pro! You will get better updates and support"
Me:
- make the snapshot after the system is already broken
- break it more
- don't restore the snapshot because it's old and you can fix it
I'm pretty sure "daughter", "son", "would", "you", "rather", "kill", "yourself", and "cannot" are all in the Bible
AI upscaling, I think
I don't need 1, I already have ears
DVI-D is basically HDMI with a larger connector, so there's nothing wrong with it
What is available is an X11 server, no more, no less; it cannot be used for anything other than X11. If they made X12, it would not work on Nvidia unless they wrote a new server, which they wouldn't.
You need to understand that the xorg server everyone uses literally does not work on Nvidia, because it uses implicit sync, which is required by the Linux graphics infrastructure. The only thing that works on Nvidia is specifically their own proprietary server.
Nvidia does a lot of impressive stuff, but they have neglected the Linux scene for a long time, because it wasn't convenient, and it shows.
Edit: ...what was available... because Nvidia is gradually implementing things the correct way, and Wayland is becoming more and more usable with every driver update. Because, surprise surprise, it does depend on the drivers. Also, both Intel and AMD work perfectly with Wayland.
It's explicit sync, look at my other comment for links
False. Xorg isn't written with support for Nvidia; when XWayland windows flicker on Nvidia, it's an effect of xorg not working on Nvidia.
The Nvidia driver is a closed-source implementation of the xorg server, written by Nvidia for Nvidia GPUs. Xorg was invented at a time when drivers were done like that.
Now xorg uses glamor (except on Nvidia), which is a driver that implements the server on top of OpenGL, so you don't need to implement the whole thing for every GPU. Except glamor doesn't work on Nvidia, because Nvidia doesn't implement implicit sync, which is required on Linux, and that is what you see in XWayland (which uses glamor as well).
Wayland doesn't require writing a whole server, but it requires implementing GBM and implicit sync (as does everything on Linux, unless you are using Nvidia's proprietary xorg server). Nvidia refused to support GBM until a few years ago, and still refuses to implement implicit sync. Which is why explicit sync will solve most issues.
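If you want to check whether your compositor and driver actually expose explicit sync, one way (assuming you have wayland-utils installed; the protocol name is from the current Wayland protocol spec) is:

```shell
# The explicit-sync protocol shows up as linux-drm-syncobj in the compositor's globals
wayland-info | grep -i drm-syncobj

# On the Nvidia side, explicit sync landed in the 555 driver series; check yours
nvidia-smi --query-gpu=driver_version --format=csv,noheader
```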
Frutiger Aero