Posts: 9 · Comments: 976 · Joined: 2 yr. ago

Permanently Deleted

  • If right-wingers (or even other leftist groups) came into an explicitly tankie community and started arguing with people, how would you react?

    Also do you actually know what tankies are? They aren't the majority in any country I know of. Non-tankie doesn't even mean right wing. Anarchists are further left than tankies.

  • I don't think partisan is even the right word here, as many Lemmy users are too far left for mainstream political parties. In fact I am further left than almost any mainstream party, but am still considered a capitalist shill by people here.

  • I don't think anti-tankies can be blamed when said tankies regularly brigade other instances. Like, if everyone actually behaved, this wouldn't have been an issue.

  • I've tried making this argument before and people never seem to agree. I think Google claims their Kubernetes setup is actually more secure than traditional VMs, but how true that really is I have no idea. Unfortunately, there are already things we depend upon for security that are probably less secure than most container platforms, like ordinary unix permissions or technologies like AppArmor and SELinux.
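As a rough sketch of how coarse those traditional unix permission checks are (a toy example, not a claim about any particular container platform): each file is guarded by a single mode word of owner/group/other read/write/execute bits.

```python
import os
import stat
import tempfile

# One mode word (owner/group/other x read/write/execute) guards each file.
with tempfile.NamedTemporaryFile(delete=False) as f:
    path = f.name

os.chmod(path, 0o640)               # owner rw, group r, others nothing
mode = os.stat(path).st_mode
print(stat.filemode(mode))          # -rw-r-----
print(bool(mode & stat.S_IROTH))    # False: "other" users cannot read it

os.unlink(path)
```

That nine-bit model is the baseline a lot of system security still rests on; MAC layers like AppArmor and SELinux, and container isolation, all sit on top of it.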

  • Did backpropagation even exist in the 60s? That was a pretty fundamental change in what they do.

    If we are arguing about really fundamental changes, then arguably any neural network is the same, and humans are the same as ChatGPT, or a mouse, or even something simpler like a single-layer perceptron.
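For what it's worth, backpropagation itself fits in a few lines. This toy two-layer regression (illustrative only: random data, made-up sizes) shows the idea being discussed, gradients flowing backwards through the layers via the chain rule:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=(4, 3))          # 4 samples, 3 features
y = rng.normal(size=(4, 1))          # regression targets
W1 = rng.normal(size=(3, 5)) * 0.1   # layer 1 weights
W2 = rng.normal(size=(5, 1)) * 0.1   # layer 2 weights

losses = []
for step in range(200):
    # Forward pass.
    h = np.tanh(x @ W1)
    pred = h @ W2
    losses.append(((pred - y) ** 2).mean())
    # Backward pass: chain rule, layer by layer (this is backpropagation).
    d_pred = 2 * (pred - y) / len(y)
    dW2 = h.T @ d_pred
    d_h = d_pred @ W2.T * (1 - h ** 2)   # tanh'(z) = 1 - tanh(z)^2
    dW1 = x.T @ d_h
    W1 -= 0.1 * dW1
    W2 -= 0.1 * dW2

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")  # loss falls as training runs
```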

  • I know, I have used them. It's actually my job to do research with those kinds of models. They aren't nearly as powerful as OpenAI's current GPT-4o or their latest models.

  • I think he's talking about people using LLMs for illegal and unethical activities such as phishing. There are already a lot of people using open-source LLMs without ethics restrictions to do bad stuff; with the power of GPT-4 behind them they would be a lot more effective.

  • That's not true though. The models themselves are hella intensive to train. We already have open source programs to run LLMs at home, but they are limited to smaller open-weights models. Having a full ChatGPT model that can be run by any service provider or home server enthusiast would be a boon. It would certainly make my research more effective.

  • There is a lot that can be discussed in a philosophical debate. However, any 8-year-old would be able to count how many letters are in a word. LLMs can't reliably do that by virtue of how they work. This suggests to me that it's not just a model/training difference. Also, evolution over millions of years improved the "hardware" and the genetic material. Neither of these compares to the computing power or amount of data used to train LLMs.

    Actually humans have more computing power than is required to run an LLM. You have this backwards. LLMs are comparatively a lot more efficient, given how little computing power they need to run. Human brains as a piece of hardware are insanely high-performance and energy-efficient. I mean they include their own internal combustion engines and maintenance and security crew, for fuck's sake. Give me a human-built computer that has that.

    Anyway, time will tell. Personally I think it's possible to reach a general AI eventually; I simply don't think the LLM approach is the one leading there.

    I agree here. I do think, though, that LLMs are closer than you think. They do in fact have both attention and working memory, which is a large step forward. The fact that they can only process one medium (text) is a serious limitation though. Presumably a general-purpose AI would ideally be able to process visual input, auditory input, text, and other things like various sensor types. There are other model types, some of which take in multi-modal input to make decisions, like a self-driving car.

    I think a lot of people romanticize what humans are capable of while dismissing what machines can do, especially given the processing power and efficiency limitations that come with the simple silicon-based processors that current machines are made from.
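The letter-counting point above comes down to tokenization. Here is a toy illustration (the vocabulary and token IDs are made up; real tokenizers like BPE are more complex): the model receives subword IDs, not characters, so the letters inside a word are simply not visible to it.

```python
# Toy vocabulary: maps subword strings to arbitrary token IDs (made up).
toy_vocab = {"straw": 101, "berry": 102, "count": 103, "ing": 104}

def toy_tokenize(word):
    """Greedy longest-match split against the toy vocabulary."""
    tokens = []
    i = 0
    while i < len(word):
        for j in range(len(word), i, -1):
            if word[i:j] in toy_vocab:
                tokens.append(toy_vocab[word[i:j]])
                i = j
                break
        else:
            raise ValueError(f"no token covers {word[i:]!r}")
    return tokens

# The model sees [101, 102] for "strawberry", not ten letters, so answering
# "how many r's?" requires it to have effectively memorized the spelling.
print(toy_tokenize("strawberry"))  # [101, 102]
print("strawberry".count("r"))     # 3 -- trivial in code, opaque to the model
```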

  • No, actually it has changed pretty fundamentally. These aren't simply a bunch of FCNs (fully connected networks) put together. Look up what a transformer is; that was one of the major breakthroughs that made modern LLMs possible.

  • Exactly this. Things have already changed and are changing as more and more people learn how and where to use these technologies. I have seen even teachers use this stuff who have limited grasp of technology in general.

  • MIMO improves throughput if you have an Internet link it can saturate; realistically, even a midrange 2x2 802.11ac router will provide more WiFi bandwidth than your internet connection does.

    And that's where the Fat Controller says you are wrong. I have 1000 Mbps down, and I've yet to actually hit that speed with WiFi 6.

    Also, newer WiFi standards significantly improve latency. That has nothing to do with having more antennas, though; you would be correct there.

    The meme is correct. A $6 ethernet cable beats any and all wifi routers and client adapters, and always will.

    With current technology you would be correct. But as for the "always" part: Ethernet is an electrical signal, so it actually propagates more slowly than the microwave signals WiFi uses, and the WiFi signals can also take a more direct path. So in the future, WiFi or LiFi could in fact be faster. It's the processing delay and scheduling that give WiFi its higher latency, not the physical medium.

    Before you say this is all academic because of the small distances involved, I would remind you that propagation delay is actually a large issue in current microelectronics and computers. Sometimes parts of the same chip are far enough apart to create problems for the engineers, due to the high clock speeds of modern devices.
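Some back-of-the-envelope numbers for the propagation argument above (the velocity factors are rough ballpark figures, and the 20 m distance is an arbitrary assumption):

```python
# Copper vs. air propagation over a router-to-PC distance, plus the
# on-chip point: how far a signal can travel in one CPU clock cycle.
C = 299_792_458            # speed of light in vacuum, m/s
v_copper = 0.66 * C        # rough velocity factor for twisted-pair copper
v_air = 0.9997 * C         # radio in air is essentially at c

distance = 20.0            # metres between router and PC (assumption)
t_copper_ns = distance / v_copper * 1e9
t_air_ns = distance / v_air * 1e9
print(f"copper: {t_copper_ns:.1f} ns, air: {t_air_ns:.1f} ns")

# At a 3 GHz clock, one cycle lasts ~333 ps, in which light covers only
# about 10 cm in vacuum (and signals in silicon interconnect travel less).
cycle_s = 1 / 3e9
print(f"distance per 3 GHz cycle: {C * cycle_s * 100:.1f} cm")
```

The medium difference is tens of nanoseconds at room scale, which is why real-world WiFi latency is dominated by scheduling and processing rather than flight time, while at chip scale the same physics genuinely constrains layout.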