
  • Yes, in fact I did say I doubt that's the problem. I suggest you try either Simple Tab Groups or Sidebery; I've never had issues with those even with hundreds of tabs (I may have a slight issue with tab management XD), but they probably don't cover all the features of this extension.

  • Yes, though now that the option exists, setting it either way would distinguish you from the group that set it the other way; I assume the OP also considered that problematic.

  • Now, I doubt this is the problem, because those pages shouldn't become completely blank, but here's an explanation:

    A single page application behaves like a native app, but it's built with web technologies and manually fitted to the browser workflow. In the older, standard approach we just delivered individual web pages, maybe with a little interactivity added on top with JavaScript, so we had routes like these:

    • example.org/index.html
    • example.org/about.html
    • etc.

    Each page is its own HTML file and you access it with its path.
    Now, there is no rule that what goes into the URL bar must match 1:1 with the filesystem on the server. You could go to example.org/news.html and actually be served a page that lives under 2023/07/28/something-interesting.html; logic running on the server decides that if a client requests the news page, the server should send over today's news page.
    You see this all the time when you try to visit a page that no longer exists and are redirected to example.org/404.html, telling you the resource you asked for wasn't found.
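    A minimal sketch of that server-side decision, assuming a simple lookup table (the paths are the ones from the example above; the function name is illustrative):

    ```javascript
    // Hypothetical route table: URL paths on the left, actual files on the right.
    // The URL a client requests does not have to match the server's filesystem.
    const routes = {
      "/index.html": "index.html",
      "/about.html": "about.html",
      // the "news" route is remapped to wherever today's article actually lives
      "/news.html": "2023/07/28/something-interesting.html",
    };

    function resolveRoute(path) {
      // unknown paths fall back to the not-found page
      return routes[path] ?? "404.html";
    }
    ```

    A real server would rebuild the news mapping every day instead of hard-coding it, but the point is the same: the mapping lives in server logic, not in the directory layout.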

    In the same vein, you can handle these routes on the client. You could send all the content to the user when they enter example.org, but let JavaScript take care of what to display, so all the text of index, about, etc. is already on your PC. Clicking the links, which have the same format, maybe minus the .html (you could absolutely drop that before too, it's just that here it conveys a specific meaning: you aren't actually requesting HTML files, only "routes"):

    • example.org/index
    • example.org/about
    • etc.

    And even though those links appear in your URL bar, they have all been resolved on the client with JavaScript, by simply changing what appears on the screen and never fetching a completely new page. Routing like that would have no problem always resolving.
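    The client-side counterpart can be sketched like this (a toy route table; names and content are illustrative, not any real framework's API):

    ```javascript
    // All pages are shipped to the client up front;
    // JavaScript just swaps which one is displayed.
    const pages = {
      "/index": "<h1>Home</h1>",
      "/about": "<h1>About</h1>",
    };

    function navigate(path) {
      // In a real browser you would also call
      // history.pushState({}, "", path) so the URL bar updates
      // without a full page load; here we only resolve the content.
      return pages[path] ?? pages["/index"]; // unknown routes fall back home
    }
    ```

    So the URL changes and the content changes, but no request for a new HTML file ever leaves the browser.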

    But when you add state to the mix, where you have something that really is a web app, you can't always get the same thing back. Suppose I have a task list (in reference to the technical React example) and create two items. I click on the second and get example.org/tasks/2; I send you that link and you open the page for the first time on your computer. It won't work; it will probably fall back to a home route, because your state was different from mine: you had no task 2 yet. This is also called deep linking. For that to work you have to store that state, and since you're working in the browser you have to rely on its storage APIs. There is usually no storage that is guaranteed to be permanent in a browser, because its settings can affect when/if the storage gets cleared, and then suddenly I can't see my task 2 anymore either after some time.
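    A small sketch of why the deep link fails, assuming localStorage-style persistence (the `makeStorage` helper is a stand-in for window.localStorage with the same getItem/setItem shape, so the sketch also runs outside a browser):

    ```javascript
    // Stand-in for window.localStorage (same getItem/setItem API),
    // backed by a plain object so this runs outside a browser too.
    function makeStorage() {
      const data = {};
      return {
        getItem: (k) => (k in data ? data[k] : null),
        setItem: (k, v) => { data[k] = String(v); },
      };
    }

    function saveTasks(storage, tasks) {
      storage.setItem("tasks", JSON.stringify(tasks));
    }

    function openTask(storage, id) {
      // Rebuild state from storage, as an app would on first load of a deep link.
      const tasks = JSON.parse(storage.getItem("tasks") ?? "[]");
      // null means "no such task" → the app falls back to the home route.
      return tasks.find((t) => t.id === id) ?? null;
    }
    ```

    On my machine the tasks were saved, so opening /tasks/2 finds task 2; on yours the storage starts empty, so the same link resolves to nothing and the app falls back home. And if my browser ever clears that storage, the link breaks for me too.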

  • what stops lemmy.world from connecting to them and sharing the same content ad free?

    That's true, I hadn't thought about that. But I wonder if he thinks Tumblr isn't popular enough to see a big loss in doing that: at most it would just get more interactions from a community that wouldn't have joined a proprietary social network in the first place, so the missed ads would have been missed anyway, while this way he gains more "free" engagement.

    As for the porn, they could probably at least defederate from the largest NSFW instances.

  • I really don't understand why you're so damn salty. I did not claim that "it is just as good" at all; did you even read? I just told you to stop listening to people who tell you that without knowing what you do. And why do you keep insisting that every developer of free software is out there to show off their higher-performance solutions, when that very clearly isn't their only goal (and in this case I've never heard anyone bragging that GIMP is so stupidly fast it completely outclasses Photoshop)? Most of the time performance is actually an afterthought: you go back and improve it after making something, because your own testing and the users' testing reveal that some processes take too long.