This is galaxies away from a good faith argument. You don't seem like a dumb person, so you're either engaging in bad faith for reasons all your own, or you're so defensive about any criticism directed towards your work that you don't realize how silly you're being. Either way, I think this is the end of the line for us. Hope you have a pleasant rest of your Sunday. I unfortunately have to keep working for a bit but will be done soon. Cheers!
Unsurprisingly, I disagree with your interpretation of the ending. I think your interpretation of the whole book says a lot more about you than it does about Vonnegut or other people; it's misanthropic, unempathetic, and patrician to the point of infantilizing others. I suspect that our views on what we as humans need to be fulfilled, what true freedom really is, and how we should treat each other are so far apart that there's no bridging it. I hope one day you reconsider. Until then, it's been fun chatting. Good luck out there, friend.
Yeah this sounds like religion to me. Believe it is true and you will believe it is true.
Are you saying that reading and interpreting the work of one of the most beloved authors in the English language is "like religion"? If so, you could not be more wrong. Reading, interpreting, and reinterpreting the work of those who came before us is actually the very core of any academic pursuit. It's the most basic description of what every single academic does with their time.
Also, you didn't address what I wrote, only the argument you think I was making.
I did, actually. I could not have addressed it more directly. Let me do it again, but this time greatly expanding it.
I am making the world a better place. Freeing humans from degrading filthy boring work. You know what really irked me the most about that novel? The population lived in a freaken utopia and couldn’t say one good thing about it.
It's been more than ten years since I read the book, but were Vonnegut a less subtle writer, that could be a literal line of dialog from one of the engineers in the book. I could imagine one of them defensively saying exactly that in an argument with the minister (whose name I forget).
You are frustrated that the engineers, through their ingenuity and hard work, have given the population a utopia, but the population is ungrateful. Your attitude is the same as that of the upper classes in that world. What you overlook is the world's inherently violent class structure, which is revealed as the book goes on. The lower classes in the book are relegated to meaningless existences in sad, mass-produced housing in Homestead, physically segregated from the wealthy. They are denied active participation in society, made obsolete by the upper classes (wealthy engineers, who, iirc, are implied to keep it in the family), who control every aspect of society. Again, it's been a very long time since I read this so I'm hazy on the details, but in the book, some in the lower classes try to actively organize against this class structure. They are brutally repressed: infiltrated by secret police and, when they rise up in protest, met with state violence.
What you describe as utopia is actually a repressive regime. It meets the subsistence needs of its lower classes in exchange for their unquestioning acceptance of the oligarchy's control of society, which the oligarchy justifies to itself and to the lower classes as a technocratic utopia ("freeing humans from degrading filthy boring work," as you say), but it is also perfectly willing to subjugate the lower classes with deadly force if they dare to question the existing power structure.
How you describe the world is exactly how the regime chooses to portray itself, and how the upper classes, consisting of people like you and me, view the lower classes. In fact, viewing the lower classes as ungrateful for the upper classes' generosity is actually a staple of upper class attitudes throughout much of human history. At the beginning of the book, since we're only given an engineer's perspective, this is an understandable reading of the world. If you read the entire book and still finished it thinking that same thing, you completely and utterly missed the point.
You and I make technology for companies, which are mostly owned by rich people. Vonnegut is asking us to interrogate the philosophy implied by our work, whether or not we intend it. We try to free people from tedious work, but if you simply ask the people we're supposedly liberating, they hate us. This is not necessarily because they like the drudgery of their work, but because the wealthy people who employ us will simply lay them off, increasing corporate profits while relegating the now-obsolete workers to the margins of society.
If the people you and I are "freeing ... from degrading filthy boring work" are actually further degraded by this so-called freedom, are we really freeing them? Maybe we should question how our society is organized if the people you and I work to "free" actually hate us for it, or as they'd put it, hate us for taking away their jobs.
I spent several years working on manufacturing and logistics automation, and I urge you to reconsider your interpretation of it.
Just from your comment, it's clear you totally missed the point of the book. It's not anti-automation. Your analysis is the exact false binary Vonnegut is interrogating. The book is actually a response to the exact attitude expressed in your comment.
I'm happy to go into it, but Vonnegut is the master; no one will say it like he does, but you have to be open to it. If you react defensively, you'll come away thinking he's just anti-technology, and that he must be wrong because technology is good. If you reread it with an open mind, or even just reflect on it again, you might find insights that are particularly important for the likes of you and me.
I've been a software engineer for 15 years. In that time, in all the jobs I've had, I've never once worked on anything that actually made people's lives better, nor have I ever heard anyone else in tech really dive into any meaningful philosophical interrogation of what digital technology is for and how we should use it. I made a few cool websites or whatever, but surely there's more we can do with code. Digital technology is so obviously useful, yet we use it mostly to surveil everyone to better serve them ads.
Then I found cybernetics, through the work of Beer and others. It's the ontological grounding that tech is missing. It's the path we didn't take, choosing instead to follow the California ideology of startups and venture capital and so on that's now hegemonic and indistinguishable from digital technology itself.
Even Beer's harshest critic is surely forced to admit that he had a hell of a vision, whereas most modern tech is completely rudderless.
That is such a massive oversimplification of how machine learning works that it's neither here nor there.
Also, automation might work in some cases and not in others. Sometimes different things really are similar, and sometimes they're not. Just because similar arguments have been made before about different things doesn't mean you get to discount them now, in a different situation.
I assume we're both living in the US? I didn't say anything about an unrealistic pedestrian utopia. I said we should improve city planning and invest in public infrastructure instead of relying exclusively on tech companies to make up for our total lack of willpower and imagination in building our physical spaces. The state of American infrastructure is absolutely pathetic.
Like I said, everything is normalized by miles or discussed in the context of distance driven.
We don't have concrete numbers for the real world cars, but we absolutely have enough to make educated estimates, and those line up with the existing data.
In a few months, the cars had some 55 incidents with emergency services. Iirc, there were only a couple hundred cars. There are millions upon millions of cars in San Francisco driving orders of magnitude more miles than that, and emergency services personnel are actively flagging the self-driving cars as a serious problem.
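To make that estimate concrete, here's a rough back-of-the-envelope sketch in Python. The incident count and fleet size are the figures above; the number of months and the per-car daily mileage are my own guesses, not published numbers, so treat the output as an order-of-magnitude estimate at best.

    # Back-of-the-envelope rate of emergency-service incidents per mile for the
    # robotaxi fleet. Incident count and fleet size come from the figures above;
    # months of operation and daily miles per car are assumptions, not published data.

    incidents = 55                 # reported incidents with emergency services
    fleet_size = 200               # "a couple hundred cars"
    months = 4                     # "a few months" -- assumed
    miles_per_car_per_day = 150    # assumed average robotaxi utilization

    fleet_miles = fleet_size * months * 30 * miles_per_car_per_day
    incidents_per_million_miles = incidents / (fleet_miles / 1_000_000)

    print(f"Estimated fleet miles: {fleet_miles:,}")
    print(f"Emergency-service incidents per million miles: {incidents_per_million_miles:.1f}")
    # With these assumptions: ~3.6 million fleet miles and ~15 incidents per
    # million miles. The rate shifts with the assumptions, but it stays in the
    # same ballpark unless the per-car mileage guess is wildly off.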
I'd obviously prefer to have better real-world data. The data we do have consistently shows self-driving cars underperforming humans per mile driven by several orders of magnitude, as Doctorow mentions in the piece I quoted. What data does exist also matches the picture that's emerging on the ground, though the numbers for that aren't in yet.
Afaik, there isn't a single piece of data in existence in favor of self-driving cars, but there is plenty against. If you have something to the contrary, lmk, because that would greatly change my opinion. I fucking want a self-driving car. They sound rad as hell. But I don't want to organize our entire society around more big tech vaporware.
I'm not sure what you mean? All the numbers I used are explicitly normalized by, or discussed in the context of, distance driven. My comment contains the phrase "miles driven" several times. Doctorow's piece that I quote from goes into more detail, again normalized by miles driven.
I’m not sure your second point is as strong as you believe it to be. Do you have a specific example in mind? I think most vehicle problems that would require an emergency responder will have easy access to a tow service to deal with the car with or without a human being involved. It’s not like just because a human is there that the problem is more easily solved. For minor-to-moderate accidents that just require a police report, things might get messy but that’s an issue with the law, not necessarily something inherently wrong with the concept of self driving vehicles.
The fire department in SF has made it very clear that these cars are a PITA for them. They drive into active emergency scenes, can't follow verbal instructions, run over fire hoses, etc.
Also, your first point is on shaky ground, I think. I don’t know why the metric is accidents with fatalities,
Fatalities is just the number we have to compare. Self-driving car companies have been publishing a simulated fatality metric for a while now. I totally agree there are other ways to think about it. My point is that AV companies have a narrative that humans are actually bad at driving, and I think this comparison pokes a hole in that story.
but since that’s what you used, what do you think having fewer humans involved does to the chance of killing a human?
I'm not sure, actually. The vast majority of driving is solo trips, so I'd expect not that much? There are some studies suggesting that people might actually use cars more if self-driving cars become a reality:
And that really gets to the heart of my problem with the self-driving cars push. When faced with complex problems, we should not assume there is a technological solution. Instead, we should ask ourselves to envision a better world, and then decide what technologies, if any, we need to get there. If self-driving cars are actually a good solution to the problem, then by all means, let's make them happen.
But I don't think that's what's happening here, and I don't think they are. American cities are a fucking disaster of planning. They are genuinely shameful, forcing their inhabitants to rely on cars, an excessively wasteful mode of transportation, all in a climate crisis. Instead of coming together to work on this problem, we're begging our technological overlords to solve it for us, with the added drawback of privatizing our public infrastructure.
Every time one of these things happens, there are always comments here about how humans do these things too. Two responses to that:
First, human drivers are actually really good at driving. Here's Cory Doctorow explaining this point:
Take the much-vaunted terribleness of human drivers, which the AV industry likes to tout. It's true that the other dumdums on the road cutting you off and changing lanes without their turn-signals are pretty bad drivers, but actual, professional drivers are amazing. The average school-bus driver clocks up 500 million miles without a fatal crash (but of course, bus drivers are part of the public transit system).
Even dopes like you and me are better than you may think – while cars do kill the shit out of Americans, it's because Americans drive so goddamned much. US traffic deaths are a mere one per 100 million miles driven, and most of those deaths are due to recklessness, not inability. Drunks, speeders, texters and sleepy drivers cause traffic fatalities – they may be skilled drivers, but they are also reckless.
There are like a few hundred robot taxis driving relatively few miles, and the problems are constant. I don't know of anyone who has plugged in the numbers yet, but I suspect they look pretty bad by comparison.
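Here's what plugging in the numbers might look like: a minimal sketch where the fleet size, months of service, and per-car daily mileage are assumptions on my part, and only the two human benchmarks come from the Doctorow quote above.

    # Compare an estimated robotaxi fleet mileage against the human benchmarks
    # Doctorow cites: ~1 fatality per 100 million miles for US drivers overall,
    # and ~500 million miles per fatal crash for school-bus drivers.
    # Fleet size, months in service, and daily miles per car are assumptions.

    human_miles_per_fatality = 100_000_000        # from the quote above
    bus_driver_miles_per_fatality = 500_000_000   # from the quote above

    fleet_size = 200               # "a few hundred robot taxis" -- rough figure
    months_in_service = 4          # assumed
    miles_per_car_per_day = 150    # assumed

    fleet_miles = fleet_size * months_in_service * 30 * miles_per_car_per_day

    print(f"Estimated total fleet miles: {fleet_miles:,}")
    print(f"Fatalities you'd expect from average human drivers over that exposure: "
          f"{fleet_miles / human_miles_per_fatality:.3f}")
    print(f"Fatalities you'd expect from school-bus drivers over that exposure: "
          f"{fleet_miles / bus_driver_miles_per_fatality:.4f}")
    # Under these assumptions the fleet has only a few million miles of exposure,
    # a few percent of what average human drivers would cover before a single
    # statistically expected fatality, so the fatality comparison can't yet show
    # the robotaxis are safer, while the non-fatal problems per mile pile up.

Swap in whatever fleet numbers you think are more realistic; the point is just that the mileage denominators are nowhere near comparable yet.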
Second, when self-driving cars fuck up, they become everyone else's problem. Emergency service personnel, paid for by the taxpayer, are suddenly stuck having to call corporate customer service or whatever. When a human fucks up, there's also a human on the scene to take responsibility for the situation and figure out how to remedy it (unless it's a terrible accident and they're incapacitated, but that's an edge case). When one of these robot taxis fucks up, it becomes the problem of whoever they're inconveniencing, be it construction workers, firefighters, police, whatever.
This second point is classic corporate behavior. Companies look for ways to convert their internal costs (in this case, the labor of taxi drivers) into externalities, pushing down their costs but leaving the rest of us to deal with their mess. For example, plastic packaging is much, much cheaper for companies than collecting and reusing glass bottles or whatever, but the trash now becomes everyone else's problem, and at this point, there is microplastic in literally every place on Earth.
🤡