Posts: 0 · Comments: 163 · Joined: 2 yr. ago

  • In the meantime, please always use Autopilot and FSD Beta as intended, which means with your hands on your steering wheels and your eyes on the road.

    This isn't how people work. If you aren't driving, then your reactions will on average be substantially slower than if you were attending to the task directly, and you have no reason to expect the sudden death wish. If you have to babysit it with your hands on the wheel, poised to avert death at any moment, it is in fact much worse than nothing.

  • You ably demonstrate your own inability to listen. The monitor on my right-hand side, right here as I type this, isn't blurry. There is no amount of proving that it MUST be blurry that is more persuasive than the fact that, as I type this, I'm looking at it.

    Furthermore, I didn't say that the existence of desktops obviated the need to worry about the impact of resolution/scaling on battery life. I said that the impacts on battery life were both minimal and meaningless, because mixed-DPI concerns by definition involve exclusively desktops and laptops plugged into external monitors, at which time logically your computer is also plugged into power. In fact the overwhelmingly common configuration for those who use external monitors is a dock, which delivers both connectivity to peripherals and power. If you are using a desktop OR a plugged-in laptop, the benefit of scaling more efficiently is zero.

    I started using Linux with the very first release of Fedora, then denoted Fedora "Core" 1. I'm not sure how you hallucinated that Wayland got 4 years of design and 8 years of implementation. First off, by the end of the month it will be 15 years old, so you fail first at the most basic math. Next, I'm guessing you want to pretend it got four years of design to make the second number look less egregious.

    With graphics programming relatively in its infancy, X11 didn't require 15 years to become usable, and how many years did Apple take to produce their stack? Was it even one? Working with incredibly powerful hardware, with a wide variety of approaches well understood and documented, 15 years is downright embarrassing. Much as I enjoy Linux, the ecosystem is kind of a joke.

  • It doesn't require a meaningful or measurable difference in CPU/GPU to scale my third monitor. That is to say, in practical effect, actual usage of real apps so dwarfs any overhead that it is immeasurable statistical noise. In all cases nearly all of the CPU power is going to the multitude of applications, not to drawing more pixels.

    The concern about battery life is also probably equally pointless. People are normally worrying about scaling multiple monitors in places where they have another exciting innovation available: the power cord. If you are kicking it with portable monitors at the coffee shop, you are infinitely more worried about powering the actual display than about the GPU power required to scale it. Also, some of us have actual desktops.

    > Furthermore, scaling up and down in multiple passes, instead of letting the clients do it in "one go" and having the compositor scan it directly onto your screen, leads to problems in font rendering

    > There are some nasty side effects

    There just aren't. It's not blurry. There aren't artifacts. It doesn't take a meaningful amount of resources. I set literally one env variable and it works without issue. In order to feel justified you absolutely NEED this to be a hacky, broken configuration with disadvantages. It's not; it's a perfectly trivial configuration, and Wayland basically offers nothing over it save for running in place to get back to the same spot. You complain about the need to set an env var, but switching to Wayland would be a substantial amount of effort, and you can't articulate one actual benefit, just fictional deficits I can refute by turning my head slightly.

    Your responses make me think you aren't actually listening. For instance:

    > X11 is utterly broken, just admit it. You are welcome to develop another X11 if you want.

    Why would I need to develop another X11? I believe I shall go on using this one, which already supported high and mixed DPI just fine when Wayland was a steaming pile of shit nobody in their right mind would use. Apparently the "nobody" includes GTK, Qt, SDL…

    Please attend more carefully. Scaling and high DPI were a thing on X back when Wayland didn't work at all. xrandr supported --scale back in 2001, and high-DPI support was a thing in 2012. Wayland development started in 2008, and in 2018 it was still an unusable, buggy pile of shit. Those of us who aren't in junior high school needed things like high DPI and scaling back when Wayland wasn't remotely usable, and now that it is starting to get semi-usable I for one see nothing but hassle.

    I don't have a bunch of screen tearing, I don't have bad battery life, I have working high DPI, I have mixed DPI, and I don't have a blurry mess. These aren't actual disadvantages; this is just you failing to attend to features that already exist.

    Imagine if, at the advent of automatic transmissions, you had 500 assholes on car forums claiming that manual-transmission cars can't drive over 50 MPH / 80 KPH and break down constantly, instead of touting actual advantages. It's obnoxious to those of us who discovered Linux 20 years ago rather than last week.

  • Nothing is set automatically; I run a window manager and it starts what I tell it to start. I observed that at present fewer env variables are required to obtain proper scaling. I did not personally dig into the reasoning for same because frankly it's an implementation detail. I just noted that Qt apps like Dolphin and Calibre are scaled without benefit of configuration, while GTK apps like Firefox don't work without GDK_SCALE set.

    X actually exposes both the resolution and the physical size of displays. This gives you the DPI, if you happen to have mastered basic math. I've no idea if this is in fact used, but your statement that NOTHING provides that is trivially disprovable by running xrandr --verbose. It is entirely possible that it's picking up on the globally set DPI instead, which in this instance would yield the exact same result, because (wait for it):

    You don't in fact even need apps to be aware of different DPIs or to dynamically adjust; you may scale everything up to the exact same DPI and let X scale it down to the physical resolution. This doesn't result in a blurry screen. The 1080p screen, while not as pretty as the higher-res screens, looks neither better nor worse than it looks without scaling.

    Why would I need to develop another X11? I believe I shall go on using this one, which already supported high and mixed DPI just fine when Wayland was a steaming pile of shit nobody in their right mind would use. It probably actually supported it when you yourself were in elementary school.
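The "basic math" above is just pixels divided by physical size. A minimal sketch, assuming example numbers of the kind xrandr reports for a 27" 4K panel (the 3840 px / 597 mm figures are illustrative, not taken from any particular machine):

```shell
# DPI = horizontal pixels / (physical width in mm / 25.4 mm per inch).
# PX and MM are example values for a 27" 4K panel, as xrandr might report
# them in a line like "3840x2160 ... 597mm x 336mm".
PX=3840
MM=597
DPI=$(awk -v px="$PX" -v mm="$MM" 'BEGIN { printf "%.0f", px / (mm / 25.4) }')
echo "${DPI} DPI"
```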

  • Whales appear to have a higher degree of understanding of the universe; I and others feel eating them is more akin to eating humans than eating chickens. If you don't agree with that premise you probably won't agree with anything else.

  • Outside of your fantasies, high DPI works fine. Modern Qt apps seem to pick it up fairly automatically now, and GTK does indeed require a variable, which could trivially be set for the user.

    Your desktop relies on a wide variety of env variables to function correctly, which doesn't bother you because they are set for you. This has literally worked fine for me for years; I have no idea what you think you are talking about. Wayland doesn't work AT ALL for me out of the box without ensuring some variables are set, because my distro doesn't do that for me; this doesn't mean Wayland is broken.
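For reference, the variable in question can be set once in a session startup file. A sketch for a 2x display; GDK_SCALE is the GTK variable named above, and pairing it with GDK_DPI_SCALE is optional, to keep fonts from being scaled twice:

```shell
# e.g. in ~/.xprofile or a window manager startup script (location is a
# matter of taste). GDK_SCALE=2 doubles GTK widget sizes;
# GDK_DPI_SCALE=0.5 compensates fonts that would otherwise scale twice.
export GDK_SCALE=2
export GDK_DPI_SCALE=0.5
```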

  • This wasn't true in 2003 when I started using Linux; in fact the feature is so old I'm not sure exactly when it was implemented. You have always been able to have different resolutions and in fact different scaling factors. It works like this:

    You scale your lower-DPI display or displays UP to match your highest DPI and let X scale down to the physical size. HIGHER DPI / LOWER DPI = SCALE FACTOR. So with two 27" monitors where one is 4K and the other is 1080p, the factor is 2; a 27" 4K with a 24" 1080p is roughly 1.75.

    Configured like so, everything is sharp and UI elements are the same size on every screen. If your monitors are vertically aligned you could put a window between monitors and see the damn characters lined up correctly.

    If you use the soooo unfriendly Nvidia GPU, you can actually configure this in its GUI for configuring your monitors. If not, you can set it with xrandr; the argument is --scale, shockingly enough.

    Different refresh rates also of course work, but you ARE limited to the lower refresh rate. This is about the only meaningful limitation.
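The arithmetic above can be sketched as follows, assuming a 27" 4K next to a 24" 1080p (both 16:9); the final xrandr call is shown as a comment because output names like HDMI-0 vary per machine:

```shell
# Scale factor = higher DPI / lower DPI. For a 16:9 panel the horizontal
# size in inches is diagonal * 16 / sqrt(16^2 + 9^2) = diagonal * 16/sqrt(337).
dpi() {  # $1 = horizontal pixels, $2 = diagonal inches
    awk -v px="$1" -v d="$2" 'BEGIN { printf "%.2f", px / (d * 16 / sqrt(337)) }'
}
SCALE=$(awk -v hi="$(dpi 3840 27)" -v lo="$(dpi 1920 24)" \
    'BEGIN { printf "%.2f", hi / lo }')
echo "$SCALE"    # lands near the "roughly 1.75" quoted above
# Apply it to the low-DPI output (name is an example; see xrandr --query):
#   xrandr --output HDMI-0 --scale "${SCALE}x${SCALE}"
```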

  • It's painful because the developers took 14 years to produce something semi-usable while ignoring incredibly common use cases and features for approximately the first 10-12 years of development.

  • I just passed --scale to xrandr after computing the proper scale, and then used the nvidia-settings GUI to write the current configuration to xorg.conf. It's not incredibly hard; basically all you are doing is scaling lower-DPI displays up to the same effective resolution as your highest-DPI display and letting X scale down to the correct physical size. For instance if you have 27" monitors that are 4K and 1080p you just scale the 1080p one by 2; if you have a 4K 27" and a 1080p 24" it's closer to 1.75. The correct ratio can be found with your favorite calculator app.

    You can set this scaling directly in nvidia-settings, come to think of it, where you set ViewPortIn and ViewPortOut.
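For the nvidia-settings route, the same effect can be expressed as a MetaModes line in xorg.conf. This is a sketch from memory of the driver's syntax, so treat the output names (DP-0, DP-2) and the exact geometry as assumptions to adapt:

```
# Render the 1080p output at 4K (ViewPortIn) and scale it down to the
# panel's native resolution (ViewPortOut); the 4K panel is left alone.
Option "MetaModes" "DP-0: 3840x2160 +0+0, DP-2: 1920x1080 +3840+0 {ViewPortIn=3840x2160, ViewPortOut=1920x1080}"
```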

  • "A little blurred" You are probably one of the fellows who walks around with a phone with a spider web of cracks because it "still works". Not sure why you imagine a blurry screen is a usable or acceptable thing. Excuse me while I return to using my 4k + 4k + 1080p 3 screen arrangement in which NONE of them are blurry and in which an app that is moved from a->b remains the same size because UI elements are scaled to the same identical size.

  • Prerequisites:

        wayland, including wayland-scanner and the base protocols
        libxkbcommon
        libtls (either from libressl, or libretls)
        A compositor making use of wlroots, or (on an experimental basis) KDE, or (if all else fails) the willingness to run questionable networking utilities with the privileges to access /dev/uinput
        wl-clipboard for clipboard support (only works on wlroots, not KDE/GNOME)

    ...

    In my case, a 'uinput' group is created, and a udev rule is used to modify the permissions appropriately:

    /etc/udev/rules.d/49-input.rules, in my case

    KERNEL=="uinput",GROUP:="uinput",MODE:="0660"

    From here, one could assign one's users to this group, but doing so would open up uinput to every program, with all the potential issues noted in the first paragraph. The safest approach is probably setgid:

    # as root -- adjust path as needed
    chown :uinput waynergy
    chmod g+s waynergy

    If this still doesn't seem to work (as in #38), be sure that the uinput module is loaded properly. This might be done by creating a file /etc/modules-load.d/uinput.conf with the contents uinput.

    This is compared to just installing Synergy, which takes 7 seconds. I'm not sure why anyone imagines this is a credible alternative.
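For reference, the quoted steps boil down to something like the following, run as root; the waynergy install path is an assumption:

```shell
# One-shot version of the uinput setup quoted above (must run as root).
groupadd -f uinput                      # dedicated group for /dev/uinput
cat > /etc/udev/rules.d/49-input.rules <<'EOF'
KERNEL=="uinput",GROUP:="uinput",MODE:="0660"
EOF
echo uinput > /etc/modules-load.d/uinput.conf
modprobe uinput                         # load now rather than at next boot
chown :uinput /usr/local/bin/waynergy   # path is an example
chmod g+s /usr/local/bin/waynergy       # setgid, per the README's advice
```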

  • The author is a Wayland fanboy, which almost by definition makes them a moron. We are talking about folks who were singing the same song like 7 years ago, when the crack they were promoting was outrageously broken for most use cases.