From my experiences, I think most people know how to cook a few dishes. But many people only cook on holidays or special occasions. Otherwise, it's mostly boxed dinners, meal kits, frozen food, take-out, or drive-thru. A lot of people feel they don't have time to cook and clean afterwards. I really only started cooking when I became vegan.
I don't think Jon is far-left. Maybe a progressive liberal. AFAIK, he's not even a SocDem, much less a DemSoc. Alex Jones is just a grifter conspiracy theorist. The likes of Matt Walsh and Michael Knowles are full-on fascists, and I'd think the mirror of that would be an anarcho-socialist or communist.
I liked his from-home stuff during Covid, but never liked TDS with Trevor Noah. I'm not exactly sure why. Too silly, not "edgy," insightful, or hard hitting enough I guess. John Oliver's show is pretty good though.
Nah, Java is alright. All the old complicated "enterprise" community and frameworks gave it a bad reputation. It was designed to be an easier, less bloated C++ (in terms of features/programming paradigms). It's also executed fairly efficiently. Last time I checked, a program that completes in some amount of time in C would typically take about 2x as long in Java, but about 200x as long in Python. Here are some recent benchmarks: https://benchmarksgame-team.pages.debian.net/benchmarksgame/fastest/python3-java.html
Had similar late childhood. Put on probation for weed possession, then would get locked up for every minor infraction. A local police officer would follow me around whenever he saw me, make up excuses for pulling me over, and search my car. Got locked up for 6 months for being 30 minutes late to school once... because I got pulled over on the way to school. I'm guessing there was some kind of graft going on with the probation officers, judges, and detention centers, because they would give out such long sentences for such minor offenses.
Yeah, I think I mostly agree. Don't think material should be confiscated though, since that could cause people to avoid official harm reduction resources. But I wouldn't want to see private businesses, like gas stations, liquor stores, or "dispensaries," making profit from selling and pushing fentanyl, tranq, krokodil, and stuff like that. I do think more drugs with low addiction and harm potential should be legalized, such as shrooms, LSD, and probably most psychedelics.
All that being said, all legalization and decriminalization must coincide with massive investment in addiction treatment, harm reduction, and probably housing. Ideally, the root causes of the drug epidemic should be addressed, such as poverty, lack of adequate healthcare such as therapy, people generally feeling hopeless because of their material conditions, etc.
I specialized in ML during grad school, but only recently got back into it and started keeping up with the latest developments. Started working at a startup last year that uses some AI components (classification models, generative image models, nothing nearly as large as GPT though).
Pessimistic about the AGI timeline :) Though I will admit GPT caught me off guard. Never thought a model simply trained to predict the next word in a sequence of text would be capable of what GPT is (that's all GPT does BTW, takes a sequence of text and predicts what the next token should be, repeatedly). I'm pessimistic because, AFAIK, there isn't really an ML/AI architecture or even a good theoretical foundation that could achieve AGI. Perhaps actual brain simulation could, but I'm guessing that is very inefficient. My wild-ass-guess is AGI in 20 years if interest and money stay consistent. Then ASI like a year after, because you could use the AGI to build ASI (the singularity concept). Then the ASI will turn us into blobs that cannot scream, because we won't have mouths :)
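That "predict the next token, repeatedly" loop can be sketched in a few lines. This is just a toy illustration of autoregressive generation; `model` is a stand-in for any function mapping a token sequence to the next token id, not a real network:

```python
# Sketch of autoregressive generation: the model only ever predicts the
# next token, and "writing" is just running that prediction in a loop.

def generate(model, tokens, max_new_tokens=50, stop_token=0):
    tokens = list(tokens)
    for _ in range(max_new_tokens):
        next_token = model(tokens)    # one forward pass -> one token
        if next_token == stop_token:  # model signals it's done
            break
        tokens.append(next_token)
    return tokens

# Toy "model": predicts last token + 1, emitting the stop token after 5.
toy_model = lambda toks: (toks[-1] + 1) if toks[-1] < 5 else 0
print(generate(toy_model, [1]))  # [1, 2, 3, 4, 5]
```

Real models output a probability distribution over the whole vocabulary and sample from it, but the outer loop is exactly this shape.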
Correct, when you talk to GPT, it doesn't learn anything. If you're having a conversation with it, every time you press "send," it sends the entire conversation back to GPT, so within a conversation it can be corrected, but it remembers nothing from previous conversations. If a conversation becomes too long, it will also start forgetting stuff (GPT has a limited input length, called the context length). OpenAI does periodically update GPT, but yeah, each update is a finished product. They are very much not "open," but they probably don't do a full training run between each update. They probably carefully do some sort of "fine-tuning" along with reinforcement learning from human feedback, and probably some more tricks to massage the model a bit while preventing catastrophic forgetting.
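The "resend the whole conversation every turn" part is easy to show concretely. This is a hypothetical sketch, not a real API; the toy "model" just reports how much context it received, which is the whole point:

```python
# Why a chat model "remembers" within a conversation: the client
# concatenates the ENTIRE message history into the prompt every turn.

def chat_turn(model, history, user_message):
    history = history + [("user", user_message)]
    # The model sees all prior messages, not just the newest one.
    prompt = "\n".join(f"{role}: {text}" for role, text in history)
    reply = model(prompt)
    return history + [("assistant", reply)]

# Toy "model" that reports how many messages of context it was given.
toy_model = lambda prompt: f"I can see {len(prompt.splitlines())} messages"

history = []
history = chat_turn(toy_model, history, "hi")
history = chat_turn(toy_model, history, "remember me?")
print(history[-1][1])  # I can see 3 messages
```

The context-length limit falls out of this too: once `history` is longer than the model's maximum input, the oldest messages have to be dropped or summarized.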
Oh yeah, the latency of signals in the human brain is much, much slower than the latency of semiconductors. Forgot about that. That further muddies the very rough estimates. Also, there are multiple instances of GPTs running, not sure how many. It's estimated that each instance "only" requires 128 GPUs during inference (responding to chat messages), as opposed to 25k GPUs for training. During training, the model needs to process multiple training examples at the same time for various reasons, including to speed up training, so more GPUs are needed. You could also think of it as training multiple instances at the same time, but combining what's "learned" into a single model/neural network.
Why keep trans women without bottom surgery out of women's restrooms? I assume all women's restrooms have stalls, so no one would see it anyways. And why isn't it about keeping vulva out of men's restrooms?
Spending definitely looks exponential at the moment:
Most breakthroughs have historically been made by university researchers, then put into use by corporations. Arguably, that includes most of the latest developments. But university researchers were never going to get access to the $100 million in compute time to train something like GPT-4, lol.
The human brain has 100 trillion connections. GPT-4 has 1.76 trillion parameters (which are analogous to connections). It took 25k GPUs to train, so in theory, I guess it could be possible to train a human-like intelligence using 1.4 million GPUs. Transformers (the T in GPT) are not like human brains though. They "learn" once, then do not learn or add "memories" while they're being used. They can't really do things like planning either. There are algorithms for "lifelong learning" and planning, but I don't think they scale to such large models, datasets, or real-world environments. I think there need to be a lot of theoretical breakthroughs to make AGI possible, and I'm not sure if more money will help that much. I suppose AGI could be achieved by trial and error (i.e. trying ideas and testing if they work without mathematically proving if or how well they'd work) instead of rigorous theoretical work.
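For what it's worth, the 1.4 million figure is just naive linear scaling of the (unconfirmed, widely circulated) GPT-4 estimates, which almost certainly doesn't hold in practice:

```python
# Back-of-envelope: if ~1.76e12 parameters took ~25k GPUs to train,
# how many GPUs would ~1e14 brain-scale "connections" take, assuming
# (unrealistically) that GPU count scales linearly with parameters?

brain_synapses = 100e12  # ~100 trillion connections
gpt4_params = 1.76e12    # rumored estimate, never confirmed by OpenAI
gpt4_gpus = 25_000       # rough training-cluster estimate

scale = brain_synapses / gpt4_params    # ~57x more parameters
gpus_needed = gpt4_gpus * scale
print(f"{gpus_needed / 1e6:.1f} million GPUs")  # 1.4 million GPUs
```

In reality, training cost also grows with the amount of training data (and compute-optimal scaling suggests data should grow alongside parameters), so linear-in-parameters is a lower bound at best.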
Hmm, looks like it would also mess up classification, recommendation, captioning, etc models using these images. Maybe image and duplicate search as well? Maybe could be used to get around automated copyright strikes?
Nah, Biden is pretty likeable, neutral, uncontroversial, and a well known name. Kamala Harris would likely perform worse, for example. I'm sure there are many better people the DNC could have promoted by giving screen time and stuff like that starting years ago, but it was much too late to start that just months before primaries. And I'm guessing Biden and his administration didn't want to step away.
Unfortunately, it looks like the DNC is currently grooming Gavin Newsom to run for president in '28, and he's extremely unlikable, IMO. And I'm not even sure there will be a real election in '28.
Meh, I've never watched a video of his, but seems like it's just a modern day game show/home makeover/dystopian feel-good story sort of thing. Neither good or bad. It's good that he helps some people, bad that the people and many, many more need help.
For the time being, most countries can get a younger population by letting in immigrants (who are statistically younger). Would probably result in a softer landing than otherwise.
Debt and money are make-believe anyway. Just tokens in a game we play called capitalism.
I'm not sure about the infrastructure claim. Generally, if infrastructure is used less, it requires less maintenance.
50% of all habitable land is already being used by humans and being degraded, which doesn't seem sustainable, especially since so much of the world still lives in poverty. World population doubles every 61 years, so it seems like it would be nearly impossible to stay on the current trajectory for much longer.
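For context on where a doubling-time figure like that comes from: under continuous compounding, doubling time is ln(2) divided by the growth rate, so a ~1.1% annual growth rate (roughly the recent global figure, which is itself declining) gives a doubling time in that ballpark:

```python
# Doubling time under continuous exponential growth: t = ln(2) / r.
import math

growth_rate = 0.011  # ~1.1% per year (assumed recent global rate)
doubling_time = math.log(2) / growth_rate
print(f"{doubling_time:.0f} years")  # 63 years
```

Since the global growth rate keeps falling, the actual doubling time keeps stretching out; the 61-year figure assumes the rate stays fixed.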
Doesn't even have to be massive. In my area, I see a half acre lot listed for $500k. I'm not even in a particularly expensive area (kinda rural, but 20mi away from a somewhat expensive metro).