theluddite @lemmy.ml · 12 posts · 348 comments · joined 2 yr. ago

I agree. I've actually written about this.
It gets solved by planning. Actual long term planning that includes the relevant stakeholders. Currently everything is run by and for VCs who only plan in terms of funding rounds and exits.
Mark Zuckerberg, Jeff Bezos, Bill Gates. Behind Every Self-Made Millionaire is a Father with Money
You're not wrong but I think you're making too strong a version of your argument. Many people, including wealthy people, are genuinely, deeply moved by art. I love the symphony, opera, and ballet. If I were rich I'd absolutely commission the shit out of some music and get a lot of real joy out of that.
Yeah, I totally see that. I want to clarify: It's not that I don't think it's useful at all. It's that our industry has fully internalized venture capital's value system and is going to use this new tool to slam on the gas as hard as it can, because that's all we ever do. Every single software ecosystem is built around moving as fast as possible, everything else be damned.
Yeah, I think helping people who don't know how to code and letting them dabble is a great use case. I fully encourage that.
I don't think it's actually good for generating scaffolding in terms of helping people write quality software, but I do agree with you that that's how people are going to use it, and then the expectation is going to become that you have to do things that fast. It's kind of mind-boggling to me that anyone would look at the software industry and decide that our problem is that we don't move fast enough. Moving too fast for speed's own sake is already the cause of so many of our problems.
I do software consulting for a living. A lot of my practice is small organizations hiring me because their entire tech stack is a bunch of shortcuts taped together into one giant teetering monument to moving as fast as possible, and they managed to do all of that while still having to write every line of code.
In 3-4 years, I'm going to be hearing from clients about how they hired an undergrad who was really into AI to do the core of their codebase and everyone is afraid to even log into the server because the slightest breeze might collapse the entire thing.
LLM coding is going to be like every other industrial automation process in our society. We can now make a shittier thing way faster, without thinking of the consequences.
I don't think that sounds like a good way to make a good paper that effectively communicates something complex, for the reasons in my previous comment.
Amen. In fact, I wrote a whole thing about exactly this -- without an LLM! Like most things I write, it took me many hours and evolved many times, but I take pleasure in communicating something to the reader, in the same way that I take pleasure in learning interesting things reading other people's writing.
I do a lot of writing of various kinds, and I could not disagree more strongly. Writing is a part of thinking. Thoughts are fuzzy, interconnected, nebulous things, impossible to communicate in their entirety. When you write, the real labor is converting that murky thought-stuff into something precise. It's not uncommon in writing to have an idea all at once that takes many hours and thousands of words to communicate. How is an LLM supposed to help you with that? The LLM doesn't know what's in your head; using it is diluting your thought with statistically generated bullshit. If what you're trying to communicate can withstand being diluted like that without losing value, then whatever it is probably isn't meaningfully worth reading. If you use LLMs to help you write stuff, you are wasting everyone else's time.
Top-down bureaucracies are bad at adoption. That's just obvious at this point. If you want to use a computer to fix this problem, you can't simply automate the existing structure. You need to actually think about how you can use the computer to do something qualitatively and structurally different than what we're currently doing, instead of the same basic thing but faster and with more data.
This is why I say that capitalism uses computers backwards. I even used online dating as an example when I wrote that almost a year ago. If you think within capitalism, and you incorporate yourself as a capitalist firm, even if you try to do good things, the structure of your solution will reflect that of your organization, and many of our problems simply don't respond well to that.
A +1 for Woodford Reserve and Knob Creek, especially Woodford's nicer offerings. Those are great choices.
I'm going to disagree on the Basil Hayden and the Bulleit. I wouldn't recommend them to a bourbon enthusiast (or to anyone really). Bulleit in particular I think doesn't really offer a lot of the classic bourbon experience that someone who is into bourbon might get excited about. To me, it drinks quite hot and is pretty thin.
I'm going to recommend Old Forester 1910. A lot of people prefer the 1920, which is a bit pricier, and I can see why they might, but I actually prefer the 1910. It's complex enough to think about but easy enough to just enjoy. It's got some classic sweet bourbon flavors (people usually describe the flavor as desserty: molasses, vanilla, etc.), and a wonderfully luxurious mouthfeel that's very bourbon and sticks around for a long time.
Yes 100%. Once you drop the false equivalence, the argument boils down to X does Y and therefore Z should be able to do Y, which is obviously not true, because sometimes we need different rules for different things.
Permanently Deleted
I didn't say there are no good uses for data. Of course there are! I even wrote "useless things" in the comment to distinguish from real uses.
Personally I think self driving cars are never going to happen and the LLM coding hype fundamentally misunderstands what software does and is actually for, but even though I don't agree with your examples, only a complete fucking moron would think computing in general is useless. My point is that current computing practices are insanely wasteful.
Permanently Deleted
That's a bad faith gotcha and you know it. My lemmy account, the comment I just wrote, and the entire internet you and I care about and interact with are a tiny sliver of these data warehouses. I have actually done sysadmin and devops for a giant e-commerce company, and we spent the vast majority of our compute power on analytics for user tracking and advertising. The actual site itself was tiny compared to our surveillance-value-extraction work. That was a major e-commerce website you've heard of.
Bitcoin alone used half a percent of the entire world's electricity consumption a couple of years ago. That's just bitcoin, not even including the other crypto. Now with the AI hype, companies are building even more of these warehouses to train LLMs.
Permanently Deleted
If you take it as a given that we should have giant warehouses full of computers using tons of energy while doing mostly pointless tasks during a climate emergency, then yes, it's a great idea.
From that same article stub:
The nonprofit DataKind has a partnership with John Jay College of Criminal Justice, where 44% of students are Hispanic, to run a predictive AI program that helps identify students — especially those from low-income families — who are in danger of dropping out because of grades or other factors.
This is a very dangerous path. I recognize it thanks to Dan McQuillan, who writes about this a lot. Governments using algorithmic tools to figure out who needs special services ends up becoming automated neoliberal austerity. He frequently collects examples. I just dug up his mastodon and here's a recent toot with three: https://kolektiva.social/@danmcquillan/111207202749078945
Also, the main headline is about automated text translations for calls, which is now "AI." Ever since ChatGPT melted reporters' brains, everything has become AI. Every time I bring this up, some pedantic person tells me that NLP (or machine vision or LLMs) is a subfield of AI. Do you do this for any other field? "Doctors use biology to cure disease," or "Engineers use physics to build bridges." Of course not, because it's ridiculous marketing talk that journalists should stop repeating.
Computers aren't people. AI "learning" is a metaphorical usage of that word. Human learning is a complex mystery we've barely begun to understand, whereas we know exactly what these computer systems are doing; though we use the word "learning" for both, it is a fundamentally different process. Conflating the two is fine for normal conversation, but for technical questions like this, it's silly.
It's perfectly consistent to decide that computers "learning" breaks the rules but human learning doesn't, because they're different things. Computer "learning" is a new thing, and it's a lot more like creating replicas than human learning is. I think we should treat it as such.
Copyright is broken, but that's not an argument to let these companies do whatever they want. They're functionally arguing that copyright should remain broken but also they should be exempt. That's the worst of both worlds.
Honestly I almost never have to deal with any of those things, because there's always a more fundamental problem. Engineering as a discipline exists to solve problems, but most of these companies have no mechanism to sit down and articulate what problems they are trying to solve at a very fundamental level, and then really break them down and talk about them. The vast majority of architecture decisions in software get made by someone thinking something like "I want to use this new ops tool" or "well everyone uses react so that's what I'll use."
My running joke is that every client has figured out a new, computationally expensive way to generate a series of forms. Most of my job is just stripping everything out. I've replaced so many extremely complex, multi-service deploy pipelines with 18 lines of bash, or reduced AWS budgets by one, sometimes two, orders of magnitude. I've had clients go from spending $1,500/month on AWS with serverless and lambda and whatever other alphabet soup of bullshit services that make no sense to 20 fucking dollars.
It's just mind-blowing how stupid our industry is. Everyone always thinks I'm some sort of genius performance engineer for knowing bash and replacing their entire front-end react framework repo that builds to several GB with server-side templating from 2011 that loads a 45kb page. Suddenly people on mobile can actually use the site! Incredible! Turns out your series of forms doesn't need several million lines of javascript.
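For what it's worth, here's a minimal sketch of what that "server-side templating from 2011" approach looks like. All the names here (the page fields, the handler, the paths) are invented for illustration, not anyone's actual stack:

```python
# Minimal old-school server-side templating: the server renders a small HTML
# string per request, with zero client-side framework. Invented example names.
from html import escape
from string import Template

PAGE = Template("""<html><body>
<h1>$title</h1>
<form method="post" action="$action">
  <label>Email <input name="email" value="$email"></label>
  <button type="submit">Save</button>
</form>
</body></html>""")

def render_settings_page(email: str) -> str:
    return PAGE.substitute(
        title="Settings",
        action="/settings",
        # Escape user-supplied data before it goes into the markup.
        email=escape(email, quote=True),
    )

page = render_settings_page("user@example.com")
print(len(page))  # the whole page is a few hundred bytes, not megabytes of JS
```

The whole "framework" is a template and a function; a form-heavy site is mostly this, repeated.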
I don't do this kind of work as much anymore, but up until about a year ago, it was my bread and butter.