Posts: 0 · Comments: 1,096 · Joined: 2 yr. ago

  • Um - Apple's problems are very public.

    It was clear they had supply constraints a few years ago, and when those cleared up there was a huge bubble of sales. Expecting growth this year when so many regular customers just got a new phone would be silly.

    And it's also a distraction - the problems facing Apple are

    1. How poorly the company is responding to antitrust complaints.
    2. The Vision Pro doesn't seem to be doing well, and their car project was so much worse they literally killed it.
    3. Twelve years ago Apple was leading the industry on digital assistants... Siri was nowhere near good enough, but nobody else had a "good enough" product either and Siri showed real promise. Now? WTF is taking so long? It's pretty clear other companies are very close to achieving what Siri never did, and there's not much to indicate Apple can keep up.
  • Nilay Patel - the editor in chief - is anti-AI, especially when it comes to article content. He doesn't allow anyone at the company to use generated content except when they are writing an article about AI, and even then only to demonstrate a point - e.g. "here's a comparison of two LLMs with the same prompt". It was also his decision to stop AIs from crawling any content on their website.

    He used AI to pad the article because that's what real spam articles do. It had nothing to do with acceptance.

  • they do some squirrely stuff to try to get you to buy a new toner cartridge early

    My Brother printer is newer than yours (the cheapest one I could get that prints on both sides of the paper), and it has a setting to toggle how it behaves when toner is low.

    The default is to pause printing until you replace the toner - honestly that's not entirely wrong. Having the printer run out of toner half way through an important print job could be a disaster.

    The alternative mode is to just show a "low toner" warning badge whenever you print a document. That's what I use, but I also check that it printed properly before closing the document, which a lot of people don't do.

    As far as I know it's just a simple counter - how many pages you've printed since it was replaced. Obviously that's never going to be particularly accurate.
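A page-counter scheme like that can be sketched in a few lines of JavaScript. To be clear, the real firmware logic isn't public - the rated yield and threshold here are illustrative guesses:

```javascript
// Hypothetical sketch of a page-counter toner estimate.
// The rated yield and the 10% threshold are assumptions.
const RATED_PAGE_YIELD = 2600; // pages the cartridge is nominally rated for

function tonerStatus(pagesSinceReplacement) {
  const remaining = Math.max(0, RATED_PAGE_YIELD - pagesSinceReplacement);
  const fraction = remaining / RATED_PAGE_YIELD;
  // A bare counter can't see actual toner coverage per page, so a
  // text-heavy user is warned far too early and a graphics-heavy
  // user may run dry before the warning ever appears.
  return fraction <= 0.1 ? "low toner" : "ok";
}

console.log(tonerStatus(1000)); // well within the rated yield
console.log(tonerStatus(2400)); // close to the rated yield
```

Which is exactly why the counter is "never going to be particularly accurate": it estimates pages, not toner.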

  • What do you mean by "local"? If you mean finding somewhere to go for lunch or the opening hours of a store, I recommend using the maps app on your phone (I prefer Apple Maps over Google, because it uses Yelp and TripAdvisor for reviews, which are more accurate than Google reviews... if I had an Android phone I'd probably install Yelp/TripAdvisor).

  • I can usually find what I need on google pretty damn quick

    It depends what you're searching for. Some things are very hard to find that used to be easy.

    The solution I'd like to see is for Google to stop being anticompetitive. For example, it recently leaked that they pay half of their company-wide profits to Apple in order to stop Apple from using (or creating) another search engine.

    Stop spending tens of billions of dollars per year trying to keep competition away, and instead invest all of that money into making Google Search a better product.

  • Because they are making so that we get less results that are just cheating the system to show up at the top?

    No, because they are failing to hide low quality search results. Something they would invest more money in if an alternative search engine existed.

    There are so many websites now that just shouldn't exist at all. And they wouldn't exist if Google didn't send tons of traffic their way.

  • ChatGPT 4 is a great assistant, I find it indispensable... I use it on my phone and computer but would like it in a dedicated device.

    Privacy? Yeah it's not great, but that's mitigated by OpenAI focusing the product hard on areas that don't really need privacy.

    I do think these tools can be private - but to get there we need more RAM on our computers and phones, and it needs to be expensive high bandwidth RAM, which costs a fortune right now. A lot of research is being done to reduce memory requirements and more manufacturing capacity for memory is being ramped up.

  • Hopes were set unreasonably high because the hardware designer has a great reputation. And the hardware seems well made (for the price) and certainly tries out some interesting new ideas. I love how the camera is physically blocked while not in use for example.

    The software team has let this product down. Not surprising, but disappointing.

  • That eVinci reactor is tiny at only 5MW. You'd need something like a thousand of them to run a single AI data center. It's also horrifically expensive at over $100 million (each! multiply that by a thousand!) and it can only produce that amount of power for eight years, then I'm not sure what you do. Buy a thousand more of them?

    For comparison, some wind turbines provide more than twice as much power from just a single turbine. And they cost single digit millions to set up. They're not as reliable and they're also bigger than a micro nuclear reactor. But none of that really matters for a data center, which can draw power from the grid if it needs to.

    The only really promising small reactor I've heard of is the NuScale one - but it may have been vapourware. Republicans made a big splash during the 2016 election campaign and committed to paying 1/12th of the cost of a reactor as part of their clean energy "commitment". There was no price tag, just 1/12th.

    A couple of years later, after they'd won the election, they quietly abandoned that plan and agreed to pay $1.3 billion, which they claimed would be 1/4th of the budget. The subtext was that the earlier election promise was made before a budget had been figured out. But going from 1/12th to 1/4th is a pretty big jump.

    And then a few years after that... when the company told the government $1.3 billion would not be enough money for the project to be financially viable... and that in order to sell electricity at all they needed the government to subsidise every single watt of power produced by the plant for the entire period that it operated... because it was going to run at a loss... that's when the government pulled all funding (except what had already been spent, which was a lot of money) and the whole project collapsed.

    I tried to find references for all of that, but the website for the project is now a "domain for sale" page. All that's left is a few vague news articles which have conflicting information. But I've been following this for decades and the project you linked to was one of the ones that made it crystal clear to me that nuclear doesn't have a future unless something really big changes.

    Who knows, perhaps if the government had been really committed to NuScale, they might've pushed through the pain and helped it succeed in order to become cheaper later. But the government wasn't willing to take that risk and apparently nobody else was either.
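The scale mismatch in the eVinci numbers above is easy to check with back-of-envelope arithmetic. The data-center demand figure is an assumption (a gigawatt-class AI campus, consistent with "something like a thousand" 5MW reactors):

```javascript
// Back-of-envelope check of the eVinci figures quoted above.
// dataCenterMW is an assumed figure for a large AI campus.
const reactorMW = 5;        // eVinci output
const reactorCost = 100e6;  // over $100 million each, per the figure above
const dataCenterMW = 5000;  // assumed gigawatt-class AI data center

const reactorsNeeded = Math.ceil(dataCenterMW / reactorMW);
const totalCost = reactorsNeeded * reactorCost;

// a thousand reactors, on the order of a hundred billion dollars,
// and the whole fleet needs replacing after eight years
console.log(reactorsNeeded, totalCost / 1e9);
```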

  • need to be somewhat close to important population areas

    They really don't. I live in regional Australia - the nearest data center is 1300 miles away. It's perfectly fine. I work in tech and we had a small data center (50 servers) in our office with a data center grade fibre link - got rid of it because it was a waste of money. Even comparing 1300 miles of latency to 20 feet of latency wasn't worth it.

    To be clear, having 0.1ms of latency was noticeable for some things. But nothing that really matters. And certainly not AI where you're often waiting 5 seconds or even a full minute.
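The distance comparison above is straightforward to estimate from the speed of light in fibre (roughly 200 km per millisecond). This is pure propagation delay and assumes the route length equals the straight-line distance, so it understates real-world latency:

```javascript
// Rough one-way propagation latency in optical fibre.
// Assumes route length == quoted distance (an underestimate in practice).
const FIBRE_KM_PER_MS = 200; // light in fibre covers ~200 km per millisecond

function oneWayLatencyMs(distanceKm) {
  return distanceKm / FIBRE_KM_PER_MS;
}

const farMs = oneWayLatencyMs(1300 * 1.609); // 1300 miles: ~10.5 ms
const nearMs = oneWayLatencyMs(0.006);       // 20 feet: effectively zero
```

Around ten milliseconds each way - real for latency-sensitive work, but irrelevant next to an AI response that takes seconds.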

  • The S30 OS, before Nokia collapsed, was much better.

    Yeah no - you're misremembering it. For example, you had to delete SMS messages or your mailbox would fill up.

    It could only fit 10 messages before it'd run out of space, and once full no messages would be received at all.

    Also, the battery life was ten days in standby if you didn't use the phone which was nice but as soon as you started using it... then it only lasted 3 hours. I used to carry two spare batteries in my bag... don't miss those days at all.

  • Apple Music is very good (and you don't need an iPhone). For me at least it recommends good songs, but even better you don't need to use those. There are extensive playlists that are manually curated by experts - for example Aussie Pub Rock is a hundred song playlist that is regularly updated by their team of editors.

    Audio quality also tends to be better on Apple Music. They encourage recording studios to produce a "Mastered for iTunes" mix and have strict quality controls, as well as training for the recording studio to make sure they do a good job. You won't find anything amateur with that label, but even for massive professional artists I think they sound better there too - because Mastered for iTunes tracks are intended to be listened to on relatively neutral speakers/headphones (the only kind Apple sells), while a lot of other services have professionally mixed tracks designed for the bass-heavy speakers so many people have nowadays outside of Apple's walled garden. I find I often need to boost the bass to get good sound from Spotify/YouTube Music/etc because they assume your speakers will do that for you.

    The difference isn't subtle - I'm not talking about a 256 vs 320 Kbps encoding difference. The same song from a major artist (e.g. Taylor Swift) will often sound totally different on Apple Music. Whether it sounds "better" depends on your speakers, but with my speakers (which are not from Apple), they do sound better. A lot better.

    But personally I've gone back to just buying music. The idea that I'll pay who knows what every month for the rest of my life... no thanks.

    I'll jump on YouTube occasionally to discover new music, but I'm not paying for it (Apple Music, sadly, has no free tier... but it does have a free trial).

  • The long-term popularity of any given tool for software development is proportional to how much labour arbitrage it enables.

    Right. Because if you quote $700,000 to do a job in C/C++, and someone else quotes $70,000 to do the same job in JavaScript... no prizes for correctly guessing who wins the contract.

    But that's not the whole story. Where C really falls to shit is if you compare it to giving the JavaScript project $500,000. At that point, it's still far cheaper than C, but you can hire a 7x larger team. Hire twice as many coders and also give them a whole bunch of support staff (planning, quality assurance, user experience design, a healthy marketing budget...)

    JavaScript is absolutely a worse language than C/C++. But if you compare Visual Studio to Visual Studio Code (with a bunch of good plugins)... then there's no comparison: VSCode is a far better IDE. And Visual Studio has been under active development since the mid '90s. VSCode has existed less than half that long and has already eclipsed it, despite being backed by the same company, and despite that company being pretty heavily incentivised to prioritise the older product (which they sell for a handsome profit margin, while the upstart is given away for free).

    I learned C 23 years ago and learned JavaScript 18 years ago. In my entire life, I've written maybe 20,000 lines of C code where I was actually paid to write that code and I couldn't possibly estimate the number of lines of JavaScript. It'd be millions.

    I hate JavaScript. But it puts food on the table, so I turn to it regularly.

    Large Language Models are a remarkable discovery that should, in the long term, tell us something interesting about the nature of text. They have some potentially productive uses. It’s destructive uses and the harm it represents, however, outweigh that usefulness by such a margin that, yes, I absolutely do think less of you for using them. (You can argue about productivity and “progress” all you like, but none of that will raise you back into my good opinion.)

    Yeah you're way off the mark. Earlier today I added this comment to my code:

    // remove categories that have no sales

    For context... above that comment were fifty lines of relatively complex code to extract a month of sales data from several database tables and summarise it down to a simple set of figures which can be used to generate a PDF report for archival/potential future auditing purposes. Boring business stuff that I'd rather not work on, but it has to be done.

    The database has a bunch of categories which aren't in use currently (e.g. seasonal products) and I'd been asked to remove them. I copy/pasted that comment from my issue tracker into the relevant function, hit enter, and got six lines of code. A simple map reduce function that I could've easily written in two minutes. The AI wrote it in a quarter of a second, and I spent one minute checking if it worked properly.

    That's not a "potential" productivity boost, it's a big one. Does that make me worse at my job? No - the opposite. I'm able to focus all of my attention on the advanced features of my project that separate it from the competition, without getting distracted much by all the boring shit that also has to be done.

    I've seen zero evidence of LLM authored code being destructive. Sure, it writes buggy code sometimes... but humans do that too. And anyone with experience in the industry knows it's easier to test code you didn't write... well guess what, these days I don't write a lot of my code. So I'm better equipped to catch the bugs in it.
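The six lines described above aren't quoted, but a snippet of that kind might look something like this in JavaScript. The function and data shapes are illustrative guesses, not the actual generated code:

```javascript
// Illustrative guess at the kind of short generated snippet described
// above; the real code and data shapes aren't shown in the comment.
function removeEmptyCategories(salesByCategory) {
  // remove categories that have no sales
  return Object.fromEntries(
    Object.entries(salesByCategory).filter(([, total]) => total > 0)
  );
}

const report = removeEmptyCategories({ beverages: 1200, seasonal: 0, snacks: 340 });
// "seasonal" is dropped from the report
```

Trivial to write by hand, which is exactly the point: it's the kind of boilerplate worth delegating and then spending a minute verifying.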

  • Yeah I call bullshit on that. I get why they're investing money in it, but this is a moonshot and I'm sure they don't expect it to succeed.

    These data centers can be built almost anywhere in the world. And there are places with very predictable weather patterns making solar/wind/hydro/etc extremely cheap compared to nuclear.

    Nuclear power is so expensive, that it makes far more sense to build an entire solar farm and an entire wind farm, both capable of providing enough power to run the data center on their own in overcast conditions or moderate wind.

    If you pick a good location, that's likely to work out to running off your own power 95% of the time and selling power to the grid something like 75% of the time. For the 5% when you can't run off your own power (no wind at night is rare in a good location, and almost unheard of under thick cloud cover), you'd just draw power from the grid - power produced by other data centers that are producing excess solar or wind power right now.

    In the extremely rare disruption where power wouldn't be available even from the grid... then you just shift your workload to another continent for an hour or so. Hardly anyone would notice an extra tenth of a second of latency.

    Maybe I'm wrong and nuclear power will be 10x cheaper one day. But so far it's heading the other direction - about 10x more expensive than it was just a decade ago - thanks to incidents like Fukushima and that tiny radioactive capsule lost in Western Australia proving that current nuclear safety standards, even in some of the safest countries in the world, are just not good enough, forcing the industry to take additional measures (additional costs) going forward.

  • the google cars few years ago had the boot occupied by big computers

    But those were prototypes. These days you can get an NVIDIA H100 - several inches long, a few inches wide, one inch thick. It has 80GB of memory running at 3.5TB/s and 26 teraflops of compute (for comparison, Tesla autopilot runs on a 2 teraflop GPU).

    The H100 is designed to be run in clusters, with eight GPUs on a single server, but I don't think you'd need that much compute. You'd have two or maybe three servers, with one GPU each, and they'd be doing the same workload (for redundancy).

    They're not cheap... you couldn't afford to put one in a Tesla that only drives 1 or 2 hours a day. But a car/truck that drives 20 hours a day? Yeah that's affordable.

  • inspect the inside and outside of the truck before and after each trip.

    This could easily be a full time job for a team of people working an ordinary 9-5 job, inspecting one truck after another all day - basically the way taxis and other car fleets are maintained.

    I'd argue that's an improvement over driving a truck. Truck mechanics are paid slightly better than truck drivers, and they work far better hours.

    Many of them can fix a blown tire or a failed spark plug

    Trucks have 18 wheels. A tire doesn't have to be fixed immediately. And I can't remember the last time I encountered a failed spark plug... but even if it were to happen one cylinder being out of action will just reduce your horsepower by 12%. You'd fix it after delivering the cargo.

    But again, roadside mechanics are a thing. And they're paid even better than workshop mechanics.

    deter theft and vandalism by often sleeping in the truck

    Human truck drivers are only allowed to drive 60 hours a week. Which means for at least 108 hours a week, the truck is parked somewhere. A self driving truck would have no such limit, and would almost always park at a safe location.

  • This tech advances very slowly.

    Historically, anything that reduces cost of transporting goods has advanced extremely quickly. The best comparison, I think, is the shipping container.

    It took about ten years for shipping containers to go from an invention nobody had heard of to one that was being used in every major seaport in the world, and about another ten years until virtually all shipping used that method.

    The New York docks, for example, dramatically increased activity (as in, handled several times more cargo per day) while also reducing the workforce by two thirds. I think self driving trucks will do the same thing - companies/cities/highways that adopt AI will grow rapidly, and any company/city/highway that doesn't support self driving trucks will suddenly stop being used almost entirely.

    Shipping containers were not a simple transition. New ships and new docks had to be built to take advantage of it. A lot of new trucks and trains were also built. Just 20 years to replace nearly all the infrastructure in one of the biggest and most important industries in the world.