Top physicist says chatbots are just ‘glorified tape recorders’
PixelProf @lemmy.ca · Posts: 3 · Comments: 112 · Joined 2 yr. ago
For me, it's the next major milestone in a roughly decade-long trend of research, and the groundbreaking part is how rapidly it has accelerated. We saw a similar boom in 2012-2018, and now it's just accelerating further.
Before 2011/2012, if your network was too deep - too many layers - it would just break down and give pretty random results; it couldn't learn, so networks were limited to relatively simple tasks. Then a few techniques were developed that enabled deep learning, the ability to really stretch the number of patterns a network could learn if given enough data. Suddenly, things that were jokes in computer science became reality. The move to roughly 95% image recognition ability, for example, took about a year to halve the error rate, and about five years to go from roughly 35-40% incorrect classification down to 5%. That's the same stuff that powered all the hype around AI beating Go champions and professional StarCraft players.
The Transformer (the T in GPT) came out in 2017, around the peak of the deep learning boom. Two years later, GPT-2 was released, and while it's funny to look back on now, it practically revolutionized temporal data coherence and showed that throwing lots of data at this architecture didn't break it like previous ones had. Then they kept throwing more and more data at it, and it kept going and improving. With GPT-3 about a year later, we saw, just like in 2012, an immediate spike in previously impossible challenges being destroyed, and these models seemingly haven't degraded with more data yet. While it's unsustainable, it's the same kind of puzzle piece that pushed deep learning to the forefront in 2012, and the same concepts are being applied to different domains like image generation, which has also seen massive boosts thanks in part to that 2017 research.
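For anyone curious what the Transformer actually computes at its core, here's a minimal sketch of scaled dot-product attention in plain NumPy - the array shapes and random inputs are purely illustrative, not taken from any real GPT implementation.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core operation of the Transformer (Vaswani et al., 2017).

    Q, K, V: (sequence_length, d_model) arrays of queries, keys, values.
    Each output position is a weighted mix of all value vectors, with
    weights given by how well its query matches every key.
    """
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # pairwise similarity, scaled
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over positions
    return weights @ V                                # blend values per position

# Illustrative sizes only: 4 tokens, 8-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)           # self-attention: Q = K = V
print(out.shape)                                      # (4, 8)
```

The whole trick is that this scales naturally with more data and longer sequences, which is what let people keep throwing data at it.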
Anyways, small rant, but yeah - its hype lies in its historical context, for me. The chatbot is an impressive demonstration of the incredible underlying advancements in data processing made over the past decade, and if working out patterns from massive quantities of data is a pointless endeavour, I have sad news for all folks with brains.
Windows 11 has a tabbed file explorer and a package manager, it's quick, the interface looks and feels nice, and it's been really stable for me. I don't know where the complaints are coming from; it's been great. All they need to do is roll back all of the ads-in-your-OS stuff from 10, and bring back a start menu that doesn't hang for 30 seconds looking something up online before showing you your installed programs.
I understand that he's placing these relative to quantum computing, and that he is specifically a scientist who is deeply invested in that realm, but it just seems too reductionist from a software perspective. Ultimately, yeah - we are indeed limited by the architecture of our physical computing paradigm, but that doesn't discount the incredible advancements we've made in the space.
Maybe I'm being too hyperbolic over this small article, but does this mean any advancement in CS research is basically just a glorified (insert elementary mechanical thing here) because it uses bits and von Neumann architecture?
I used to adore Kaku when I was young, but as I got into academics, saw how attached he was to string theory long after its expiry date, and watched him build popularity on pretty wild and speculative fiction, I've struggled to take him too seriously in this realm.
In my experience, which comes with years in labs working on creative computation, AI, and NLP, these large language models are impressive and revolutionary, but quite frankly, for dumb reasons. The Transformer was a great advancement, but seemingly only once we piled obscene amounts of data on it - previously unheard-of amounts. Now we can train smaller bots off of the data from these bigger ones, which is neat, but it still comes down to that mass of data.
To the general public: yes, LLMs are overblown. To someone who spent years researching creativity-assistance AI and NLP: these are freaking awesome, and I'm amazed at the capabilities we have now in creating code that can do qualitative analysis and natural language interfacing. But the model is unsustainable unless techniques like Orca come along and shrink down the data requirements. That said, I'm running pretty competent language and image models on a relatively cheap consumer video card with 12GB of VRAM, so we're progressing fast.
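For a sense of how little code it takes to run one of these models locally, here's a minimal sketch using the Hugging Face transformers library, assuming torch, transformers, and accelerate are installed; the model id and generation settings are just illustrative stand-ins, not what I actually run.

```python
# Minimal sketch: load a small causal language model in half precision on a GPU
# (if available) and generate a short continuation. "gpt2" is only a tiny
# stand-in model chosen because it fits in almost any VRAM budget.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision roughly halves memory use
    device_map="auto",          # place weights on the GPU if one is available
)

inputs = tokenizer("The Transformer architecture is", return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.9)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Swap in a larger quantized model and the same handful of lines is basically all that's needed to stay inside a 12GB card.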
Edit to Add: And I do agree that we're going to see wild stuff with quantum computing one day, but that can't discount the excellent research being done by folks working with existing hardware, and it's upsetting to hear a scientist balk at a field like that. And I recognize I led this by speaking down on string theory, but string theory pop science (including Dr. Kaku's) did real damage to how seriously people take physics.
This is the start of the use cases I wanted to see take off with Mastodon/Lemmy/Kbin. Much like the previous era of distributed content with user-hosted voice servers and forums, having larger communities/organizations run their own instances and avoid trying to treat the space as one big pool of content is the real use case here. The fact that you can cross-instance subscribe and post makes it viable long-term.
It also gives "free" verification of information's sources based on the domain, the same way that (modern) email gives you an extra layer of confidence when you see a verified domain. I would love to see the Government of Canada, the CBC, and universities all starting their own instances and utilizing them in unique and interesting ways. With enough adoption, official provincial/municipal instances could pop up to make organized communities easier.
It feels to me like a starting move away from the autocracy that the platform economy has created. It's not universal, but I absolutely push back against too many instances trying to be "general purpose Reddit replacements" because that seems like a fleeting use case for what it can eventually become, and it just confuses the whole abstraction of what these decentralized socials afford.
Poorly.
More seriously, I didn't know I had ADHD, but I'd kind of naturally contorted my world to support it as best I could. I worked flexible, four-month contracts. I only worked in low-stakes positions where leaving after a few months was expected. When I was a young kid, I was really good at convincing teachers that they didn't need to see my homework or that I needed an extra day, because even though the work was trivial, I wouldn't do it until the day after the deadline.
I've minimized obligations where I can: autopay for every bill, not driving so I never have to take a car to the shop for maintenance, renting so I'm not on the hook for repairs, and chronically overthinking purchases to avoid impulse spending most of the time, at the sacrifice of not getting things I probably need.
I'm still working on it, but I think reducing the places where you can really mess things up on a bad brain day and doing what you can to nurture an environment where you can follow your rhythms is important. Way easier said than done, of course.
Soap. 100%.
Yeah, I'm not personally attributing malice; it's just felt like a bit of a show of power and the absurdity of wealth that he can kind of accidentally buy one of the largest companies and mess with it for the lulz without caring about the consequences. It seems like the kind of stupidity I would have gotten up to if I'd been a billionaire as a nihilistic teen high on chan boards.
I know this post and comment might sound shilly, but switching to more expensive microfibre underwear actually made a big impact on my life and motivated me to start buying better-fitting and better-material clothes.
I'd always bought cheap and thought anything else was silly. I was wrong. They're so much more comfortable, I haven't had a single pair even begin to wear down, there's less sweating and I feel cleaner, they fit better, and they haven't been scrunchy or uncomfortable once compared to the daily issues of that cheap FotL life. This led to more expensive and longer-lasting socks with textures I like better, and better-fitting shoes that survive more than one season.
It was spawned by some severe weight loss and a need to restock my wardrobe. My old underwear stuck around as backups to tell me I needed to do laundry, but going back to the old ones was bad enough that I stopped postponing laundry.
Basically, I really didn't appreciate how much I absolutely hated so many textures I was constantly in contact with until I tried alternative underwear and realized you don't have to just deal with that all the time.
It depends what "From Scratch" means to you, as I don't know your level of programming or your interests. You could be talking about making a game from beginning to end, and that could mean...
- Using a general purpose game engine (Unity, Godot, Unreal) and pre-made assets (e.g., Unity Asset Store, Epic Marketplace)?
- Using a general purpose game engine almost purely as a rendering and input engine with a nice user interface, and building your own engine on top of that
- Using frameworks for user input and rendering images that aren't necessarily built for games - they're more general purpose, so you'll need to write a lot of game code to put it all together into your own engine before you even start "making the game", but they offer extreme control over every piece so you can make something very strange and experimental, at the cost of lots of technical overhead before you get started
- Writing your own frameworks for handling user input and rendering images... the same as the previous, but you'll spend 99% of your time trying to reinvent the wheel and get it to go as fast as any off-the-shelf replacement
If you're new to programming and just want to make a game, consider Godot with GDScript - there's a guide created in Godot for learning GDScript interactively with no programming experience. GDScript is similar to Python (a very widely used language outside of games), but it's exclusive to Godot, so you'd have to transfer those skills if you move elsewhere. You can also use C# in Godot; it's a bigger learning curve, but it's very general and used in a lot of games.
I'm a big Godot fan, but Unity and Unreal Engine are solid too. Unreal might have a steeper learning curve; Godot is a free and open-source project with a nice community, but it doesn't have the extensive userbase and forum repository of Unity and Unreal; and Unity is so widely used that there's lots of info out there.
If you did want to go really from scratch, you can try something like Pygame in Python or Processing in Java, which are entirely code-created (no visual editor) but offer lots of helpful functionality for making games purely from code. Very flexible. That said, they'll often run slow, they'll take more time to get a project started, and you'll fairly quickly hit a ceiling on how much you can realistically build in them. There's a small sketch of what that looks like below.
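To give an idea of what "purely from code" looks like, here's a minimal Pygame sketch - the window size, colours, and speed are arbitrary, just to show the shape of a hand-written game loop.

```python
# A minimal Pygame sketch: window, game loop, input handling, and a moving square.
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 480))
clock = pygame.time.Clock()
x, y, speed = 320.0, 240.0, 200.0  # position in pixels, speed in pixels/second

running = True
while running:
    dt = clock.tick(60) / 1000.0        # seconds since last frame, capped at 60 FPS
    for event in pygame.event.get():
        if event.type == pygame.QUIT:   # window close button
            running = False

    keys = pygame.key.get_pressed()     # arrow keys move the square
    x += (keys[pygame.K_RIGHT] - keys[pygame.K_LEFT]) * speed * dt
    y += (keys[pygame.K_DOWN] - keys[pygame.K_UP]) * speed * dt

    screen.fill((30, 30, 30))
    pygame.draw.rect(screen, (200, 120, 50),
                     pygame.Rect(int(x) - 16, int(y) - 16, 32, 32))
    pygame.display.flip()

pygame.quit()
```

Everything a general purpose engine would hand you - scenes, physics, asset pipelines - you'd be writing on top of a loop like this yourself.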
If you want to go a bit lower, C++ with SDL2, learning OpenGL, and learning how games are rendered is great - it will be fast, and you'll learn the skills to modify Godot, Unreal, etc. to do anything you'd like, but with similar caveats to the previous option: there's likely a low ceiling on the quality you'll be able to put out, and high overhead to get a project started.
My cynical guess is that's what they're hoping the community will do ("like lemmings, I tell you!" - spez, probably) to drive higher traffic numbers before some announcement or meeting.
Yeah, I really think it's important to not see Lemmy as one singular community, or a lot of important use cases will go ignored.
Not the OP, but in Canada at least, I think you would legally be expected to, because common law is (as far as I'm aware) very nearly marriage and is entirely implied by time living together in a conjugal relationship. The actual property laws might be determined provincially, though.
I don't have a firm opinion here, but I think the key difference in your case is that a conjugal relationship has some expectation of... oh, I don't know, mutuality? A landlord-tenant relationship is a lease agreement. If your roommate didn't sign any kind of lease agreement, they might have a legal case to just not pay you and suffer no consequences (I don't know), but they're not in a conjugal relationship, so there's also no implication of shared ownership.
Without a signed lease agreement, and while in a conjugal relationship, I think expecting shared ownership is a pretty fair assumption.
That all said, it's also really up to the individuals to figure that out early, and the deception in the meme suggests that the agency to have that discussion wasn't available, and that's really the part I find problematic here.
That and expropriation/eminent domain, etc. Even if you pay your taxes, if the government needs it, they have processes to take it.
I'm not saying it's an inherently bad thing, but it's another one of those important things to realize is already present if anyone wants to argue for/against certain government reforms.
I certainly used to, and I used to think it was essentially gender neutral, but again - in certain contexts, like a male-dominated classroom, the women/NB students could easily feel excluded by it. Outside of that, I also recognized my trans friends had a lot of thoughtless people intentionally misgendering them on the regular just to be mean, and finding small ways to reduce that reinforcement felt better than not. It was also surprisingly not that tough for me to adopt the more neutral language, so if it's a subtle help at no skin off my back, it seems very win-win.
I know it's controversial, but moving away from "guys" when I address a group and more or less defaulting to "they" when referring to people I don't know.
"They" was practical, because I deal with so many students exclusively via email, and the majority of them have foreign names where I'd never be able to place a gender anyway if they didn't state pronouns.
Switching away from "guys" was natural, but I'm in a very male-dominated field, and I'd heard from women students in my undergrad that they did feel just a bit excluded in a class setting (not so much in social settings) when the professor addresses a room of 120 men and 5 women with "guys", so it more or less fell to the side in favour of folks/everyone.
Only when it's intentionally censored and trained to react in a particular way. When it's not, you remember it was trained on random internet content.
Getting really speculative, but maybe infinite scrolling and similar UX design patterns. I think we learned they were dangerous pretty early on, but I have a feeling there isn't currently a widespread understanding of just how badly things like infinite scrolling short-circuit parts of the brain and cause issues with attention and time regulation in large populations.
If I'd researched it more, I might include infinite short-form content feeds of almost any type, to be honest, which may just be another way of saying social media.
Conversely, if smart watches with accurate health monitoring become cheap and commonplace, they might drastically improve health outcomes by motivating people to see doctors when needed for subtle heart issues that would otherwise go unchecked.
"Where did you learn this skill?" "The olde rules prevent me from discussing this further."
Oh for sure. And it's a great realm to research, but pretty dirty to rip apart another field to bolster your own. Then again, string theorist...