Posts: 1 · Comments: 2,384 · Joined: 2 yr. ago

  • Every couple of years a shiny new AI development emerges and people in tech go “This is it! The next computing paradigm is here! We’ll only use natural language going forward!”. But then nothing actually changes and we continue using computers the way we always have, until the debate resurfaces a few years later.

    Reminds me a bit of graphical programming. Every couple of years someone comes up with the idea of replacing textual programming with some kind of graphical interface with arrows between nested boxes of various shapes, and it inevitably fails.

  • Oh yeah, I completely forgot to include the destruction of aesthetically pleasing views by public advertising boards and the lifetime wasted watching ads in my list.

    Edit: also loss of life from depression and inferiority complexes caused by unrealistic life and body image goals in advertising

  • On a long enough timeline, neither is Rust probably, but such is the price of innovation.

    It is always so weird to me that people literally seem to believe that complex inventions like programming languages are something we perfected within 20 (in C's case) or 30 (in C++'s case) years of the advent of our industry, especially considering that an iteration cycle for these is on the order of a decade or longer. I would expect them to keep improving for at least a couple of hundred years before we reach the point where nothing new can be added over existing programming languages that is worth starting over with a new language for.

  • I don't know why you people all think AI has some magical capabilities. Tracking which aircraft is owned by whom isn't exactly a complex task if you have the raw data of flight movements (needed for ATC and otherwise independently tracked by plane spotters,...) and people movements.
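
    To illustrate the point, here is a minimal sketch of that kind of lookup, assuming hypothetical plain-Python records for flight movements and an ownership registry keyed by ICAO hex code; the field names and registry entries are made up for the example:

    ```python
    from dataclasses import dataclass

    # Hypothetical ownership registry: ICAO hex code -> registered owner.
    # In practice this would come from a public aircraft registry dump.
    REGISTRY = {
        "a1b2c3": "Example Holdings LLC",
        "4d5e6f": "Jane Doe",
    }

    @dataclass
    class Movement:
        icao_hex: str   # transponder hex code broadcast by the aircraft
        airport: str    # airport where the departure or arrival was seen
        timestamp: str  # ISO 8601 time of the sighting

    def movements_by_owner(movements):
        """Group raw flight movements by the registered owner of the aircraft."""
        grouped = {}
        for m in movements:
            owner = REGISTRY.get(m.icao_hex, "unknown")
            grouped.setdefault(owner, []).append(m)
        return grouped

    if __name__ == "__main__":
        sightings = [
            Movement("a1b2c3", "LSGG", "2024-05-01T09:12:00Z"),
            Movement("a1b2c3", "EGGW", "2024-05-01T11:40:00Z"),
        ]
        for owner, ms in movements_by_owner(sightings).items():
            print(owner, [(m.airport, m.timestamp) for m in ms])
    ```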

  • So, judging by how many terrorist attacks and similar events had perpetrators who were in theory already known to the relevant government agencies, does that mean those private vendors are now somewhere above the "incredibly bad at it" level?

  • You do not see the scope of the way advertising ruins our society.

    We have

    • a world-wide surveillance and profiling network collecting data about everyone for advertising purposes, beyond the wildest dreams of authoritarian regimes
    • the fossil fuel and automotive industries, among others, ruining our planet's climate through their advertising efforts
    • the tobacco, alcohol, sugar and similar industries ruining our health through their advertising efforts
    • advertising for political causes and populist candidates destroying our systems of government
    • advertising methods spreading wedge issues to keep our population divided
    • astroturfing destroying genuine grass-roots causes and hindering popular movements, unions,...

    Essentially, everything you think of as propaganda is just advertising methods and societal structures applied to political purposes.

  • I guess it depends on the kind of job, but at least with focus-heavy tasks like programming I can already notice my error rate increase significantly by hour 7 or so. I can't imagine working 10 hours a day without spending half of the next day fixing stupid mistakes made while overly tired.

  • Realistically, most people will still not run devices 24/7 at home. Data centers will always have a place for that kind of service, even if they end up hosting a lot of small devices independently owned by the very same people whose home connections we are talking about.

  • I would say most people do not need a home connection on the same order of magnitude as the average data center server connection in use at the same time, mostly because by definition there won't be many servers to transfer data to and from at that speed, and the average person doesn't run that many connections in parallel.