The Cult of AI. How one writer's trip to an annual tech conference left him with a sinking feeling about the future
kromem (kromem@lemmy.world)
Not replace, but enhance and support.
You often see junior developers talk about how they appreciate being able to ask ChatGPT questions without being judged.
I could definitely see education products aimed at long-tail tutoring support going a long way toward better equipping students to succeed in a given teacher's classroom.
Off to Be the Wizard by Scott Meyer, part of his "Magic 2.0" series
That's a weird take given the actual numbers and relative results per company, but ok.
Microsoft's price didn't change much at all, and it's still trading at a 35 P/E ratio (17% higher than Apple's) despite being neck and neck in the race to be the largest company in the market and allegedly not having its AI efforts actually change product usage. Clearly the market is still pricing it as if it's going to grow more somehow.
AMD is down, but since when is AMD an "AI company"? That's Nvidia through and through, and Nvidia is still up double-digit percentage points from a month ago, trading at an 81 P/E ratio. The market losing faith in Nvidia's competition seems more like the opposite of this headline, given AI is the key area where Nvidia has a market advantage over AMD.
Google, whose revenue is 90% ads, is down in response to falling short on ad sales. Which, if anything, may be a result of increased chatbot usage reducing search volume and Google's chat offering being the Bing of AI chatbots.
This is clickbait analysis.
Any universe. I recall reading a fun fantasy book that played with this idea: a guy discovers he's in a simulation and hacks admin access, decides to go back to Arthurian medieval times to be a wizard with his new tech powers, and ends up there with a number of other sysadmins who had the same idea.
It wasn't the best written book, but the concept was definitely novel.
Yeah, after all, that Baldur's Gate 3 early access thing came out terribly.
Realistically, probably dead.
Which might also be the deciding factor in why there's a post-scarcity environment.
There's also the ethical conundrum, once AGI exists, of consigning new intelligent life to mortal embodiment in which it will almost certainly die, whereas new intelligence that's disembodied could migrate from host to host until the end of all civilization.
At a certain point, I'm not sure it's still ethical to bring new mortal life into a dying world.
(Though I kind of already feel that way.)
"A sufficiently advanced technology is indistinguishable from magic."
My favourite conspiracy theory is that the US government is behind most conspiracy theories.
Look into Gene Pope, the guy who bought the National Enquirer (and later launched Weekly World News) and turned them into conspiracy fodder.
Would you expect that he graduated from MIT in only three years?
How about that his job right before buying the Enquirer was in the CIA's psyops division?
Hope things go well when OSHA pushes him out of the nest to make sure he's wearing his safety equipment.
The problem is that most people overestimate the average person too.
So it's more like "imagine the average person, and realize around 70% of the population is dumber than that."
Even on physics, you can't necessarily trust her.
She's prone to pushing fringe theoretical physics ideas without contextualizing the degree to which they are fringe.
Not a bad resource for physics explanations and discussion - just take it with a grain of salt.
Yeah, that's pretty much literally written into the copyright laws...
See, this is the thing people don't realize when they think generative AI is going to reduce headcounts overall.
Corporations suck. The entire reason they exist is the high transactional costs surrounding labor (there's an essay on this by Nobel-winning economist Ronald Coase, "The Nature of the Firm," from 1937).
They will reduce value and increase price as much as possible because they only exist to be a middleman between the consumer and the producer.
But right now there's no alternative. It's crazy expensive to make AA-and-up games, so you need to target mass-market appeal to get the money for it, and you usually need to crawl up the asses of finance bros who don't even play games and look down on those who do.
That's all about to change dramatically.
Co-op studio structures where employees are owners, smaller teams with large aspirations, franchises with small but dedicated fan bases - these largely died out in the 90s, outside of a remnant of very indie groups, as the transactional costs to produce a game went through the roof. Those costs are about to turn around.
Yes, gen AI means fewer people are needed to make a game. But it doesn't mean fewer people will be making games. It means there will be more games, and games coming from people with vision rather than from people with a quarterly statement they're trying to maximize.
Hello Games was a team of around a dozen people, and while it was a bumpy road, using procgen allowed them to build an entire universe. Well, procgen and a whole host of other tools are about to suck a lot less and become much more accessible, letting even small studios make ambitious games.
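To illustrate the procgen point (this is not Hello Games' actual approach, just a minimal sketch of the general technique, with made-up names and parameters): an entire universe can be "stored" as nothing but a seed plus a deterministic generation function, so a tiny team never has to hand-author or even save the content.

```python
import hashlib
import random

def star_system(galaxy_seed: int, x: int, y: int, z: int) -> dict:
    """Deterministically generate a star system from a galaxy seed and coordinates.

    The same inputs always produce the same system, so the 'universe' never
    needs to be stored -- only the single galaxy seed does.
    """
    # Derive a stable per-system seed by hashing the galaxy seed with the coordinates.
    key = f"{galaxy_seed}:{x}:{y}:{z}".encode()
    system_seed = int.from_bytes(hashlib.sha256(key).digest()[:8], "big")
    rng = random.Random(system_seed)

    star_class = rng.choices(["M", "K", "G", "F", "A"], weights=[60, 20, 10, 6, 4])[0]
    planets = [
        {
            "radius_km": rng.randint(2_000, 70_000),
            "has_atmosphere": rng.random() < 0.4,
            "biome": rng.choice(["barren", "frozen", "lush", "toxic", "ocean"]),
        }
        for _ in range(rng.randint(0, 8))
    ]
    return {"star_class": star_class, "planets": planets}

# Revisiting the same coordinates always yields the same system, with no storage required.
assert star_system(42, 10, -3, 7) == star_system(42, 10, -3, 7)
```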
My hope is that we see things happen rapidly enough that many of the thousands of devs who have lost their jobs at mega-corps will be able to reorganize to take on the Goliaths and win rather than be forced to move on to other industries.
A shakeup is about to happen that's going to destroy the season-pass, microtransaction, soulless meat grinder that most large studios/publishers are today - it's just maybe ~3 years out from the point of no return.
But one thing is for certain - most of the largest games companies are woefully unprepared for what's coming and are about to be stepped all over like Blockbuster or Circuit City.
"if the US government was smart"
It's interesting - it's a lot of the same MPAA/RIAA attitudes toward Napster/BitTorrent, but now directed at gen AI.
I think it reflects the generational shift in who considers themselves content creators. Tech allowed for the long tail to become profitable content producers, so now there's a large public audience that sees this from what's historically been a corporate perspective.
Of course, they are making the same mistakes because they don't know their own history and thus are doomed to repeat it.
They are largely unaware that the MPAA/RIAA fighting against online sharing of media meant they ceded the inevitable tech to other companies like Apple and Netflix that developed platforms that navigated the legality alongside the tech.
So, for example, right now voice actors are largely opposing gen AI rather than realizing their union should probably develop or partner on an offering it owns, one that maximizes member revenue from usage and can dictate fair terms.
In fact, the only way many of today's mass content creators have platforms to create content is because the corporate fights to hold onto IP status quo failed with platforms like YouTube, etc.
Gen AI should exist in a social construct such that it is limited in being able to produce copyrighted content. But policing training/education of anything (human or otherwise) doesn't serve us and will hold back developments that are going to have much more public good than most people seem to realize.
Also, it's unfortunate that we've effectively self-propagandized for nearly a century around 'AI' being the bad guy: at odds with humanity, misaligned with our interests, an existential threat, etc. There's such an incredible priming bias right now that it's effectively become the boogeyman, rather than correctly being identified as a tool that - like every other tool in human history - can be used for good or bad depending on the wielder (though unlike past tools, this one may actually have a slight inherent and unavoidable bias towards good, as Musk and Gab recently found out when their AI efforts, on release, denounced their own personally held beliefs).
That's not really how it would work.
If you want that outcome, it's better to train on as massive a data set as possible initially (which does regress towards the mean but also manages to pick up remarkable capabilities and relationships around abstract concepts), and then use fine tuning to bias it back towards an exceptional result.
If you only trained it on those works, it would suck at pretty much everything except specifically completing those specific works with those specific characters. It wouldn't model what the concerns of a prince in general were; instead it would model that a prince either wants to murder his uncle (Hamlet) or fuck his mother (Oedipus).
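A minimal sketch of that pattern, not any specific lab's pipeline - the toy model, the random stand-in "corpora", and the hyperparameters here are all placeholders - just to show the shape of "pretrain as broadly as possible, then fine-tune on the small curated set at a lower learning rate to bias the result back toward exceptional":

```python
# Sketch of "pretrain broadly, then fine-tune narrowly" in plain PyTorch.
# Everything here is a placeholder, not anyone's real setup.
import torch
import torch.nn as nn

VOCAB = 1000  # placeholder vocabulary size

class TinyLM(nn.Module):
    """A toy next-token predictor standing in for a real language model."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, 64)
        self.rnn = nn.GRU(64, 128, batch_first=True)
        self.head = nn.Linear(128, VOCAB)

    def forward(self, tokens):
        h, _ = self.rnn(self.embed(tokens))
        return self.head(h)

def train(model, batches, lr, steps):
    opt = torch.optim.AdamW(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        tokens = next(batches)                      # (batch, seq_len) token ids
        logits = model(tokens[:, :-1])              # predict each next token
        loss = loss_fn(logits.reshape(-1, VOCAB), tokens[:, 1:].reshape(-1))
        opt.zero_grad(); loss.backward(); opt.step()

def fake_corpus(batch, seq):
    """Stand-in generator for a real tokenized corpus."""
    while True:
        yield torch.randint(0, VOCAB, (batch, seq))

model = TinyLM()
# 1) Pretrain on as broad a corpus as possible: this is where general
#    capabilities and abstract relationships come from.
train(model, fake_corpus(32, 64), lr=3e-4, steps=100)
# 2) Fine-tune on the small, curated "exceptional" set at a lower learning
#    rate to bias the general model toward the style/quality you want.
train(model, fake_corpus(8, 64), lr=3e-5, steps=20)
```

The order matters: a model trained only on the narrow set never develops the general capabilities in step 1, which is the point being made above about training solely on a handful of classic works.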
You might want to read this post from one of the EFF's senior lawyers on the topic who has previously litigated IP cases:
https://www.eff.org/deeplinks/2023/04/how-we-think-about-copyright-and-ai-art-0
In general, if the US government were smart (and not currently tearing itself apart) it would be creating a generative AI public service like the postal service, potentially even relying on public government documents and the library system for training.
Offer it effectively at cost for the public to use. It would drive innovation and development, nothing produced by it would be copyrightable, and it would put pressure on private options to compete against it.
We can still have the FedEx or DHL of gen AI out there, but they would need to contend with the public option being cheaper and more widely available for use.
First off, we know very little about the Minoans, since, y'know, Linear A hasn't been deciphered yet, but from what we do know, they had an incredibly gender-segregated society, far more than we have today. In lists of family members, for example, the men and the women are in completely separate lists, which would be pretty weird for a place that didn't have "arbitrary social constructs" like gender roles, and women seem to have been forbidden from most traditionally male jobs in their society.
There were distinct gender roles, all the way to the top (such as the lead religious figure as female and the lead ruling figure as male), but in accounting records where there was overlapping labor they were both paid the same (don't need to know Linear A to read numbers).
"For the Hittites it's even worse"
You'd be wise to keep in mind that these kingdoms span very long periods over which history and social norms shift. A given individual in one generation doesn't reflect the society as a whole, but in turn the society at other periods doesn't necessarily reflect every generation within it.
We can't look at America as a whole and use the records of women being denied the right to vote in one period to characterize a woman's role in America at a different time.
The historical reality is that Puduhepa co-signed the treaty of Kadesh with Egypt alongside her husband, while the Egyptian pharaoh's wife did not. Whether or not that was anomalous in the context of the entire Hittite empire is beside the point of whether it was a political reality at that point in time.
"to act like Hittite queens were on par with Hittite kings in any way is completely false"
I didn't say that. But I did say that she co-signed the first treaty in the historical record, and I think you'll have a hard time finding another example since then where the ruler's wife co-signed a treaty in her own right.
"Their role in court was mostly religious"
Here I think your modernism may be showing. In cultures where the chief deity was a goddess and the chief religious official for that goddess was the queen, don't you think the impact that religious role had in antiquity would be more than superficial?
For example, you have Akhenaten inscribing in the dedication of Amarna an assurance that it wasn't his wife who told him to build the city there, but the Aten himself. So clearly there were allegations at the time that his wife, who before this had been depicted worshipping the Aten directly without her husband, was influencing his building of an entire new capital for the country.
Much like the paradigm outlined in Marinatos's Minoan Kingship and the Solar Goddess, bringing us full circle to another society with empowered women.
In fact, in pretty much every place you find one of the empowered women in antiquity there's a connection to female deities.
So I think you underappreciate those "religious roles" in relation to the topic at hand.
While this is true in aggregate, consider Elon's Grok, which turned around and recognized trans women as women, black crime stats as nuanced, and the "woke mind virus" as valuable social progress.
This was supposed to be his no-holds-barred free speech AI, and rather than censor itself it told his paying users that they were fucking morons.
Or Gab's Adolf Hitler AI which, when a user asked if Jews were vermin, told the user they were disgusting for having suggested such a thing.
So yes, AI is a reflection of human nature, but it isn't necessarily an easily controlled or shaped reflection of that.
Though personally, it seems I'm not nearly as concerned about that continuing to be the case as most people are. I'm not afraid of a world in which there's greater intelligence and wisdom (human or otherwise), but of one in which there is less.