
Posts 0 · Comments 1,096 · Joined 2 yr. ago

  • This is a Financial Times article, regurgitated by Ars Technica. The article isn't by a tech journalist; it's by a business journalist, and their definition of "AI" is a lot looser than what you're thinking of.

    I'm pretty sure they're talking about things Apple is already doing, not just on current hardware but even on hardware from a few years ago. For example, the keyboard on iOS now uses pretty much the same technology as ChatGPT, but scaled way, way down, to the point where "Tiny Language Model" would probably be more accurate. I wouldn't be surprised if the training data is as small as ten megabytes, compared to half a terabyte for ChatGPT.

    The model will learn that you say "Fuck Yeah!" to one person and "That is interesting, thanks for sharing it with me." to someone else. Very cool technology, but it's not AI. By the way, the keyboard really will suggest swear words now, if you've used them previously in a context similar to the current one. The old algorithmic keyboard had hardcoded "do not swear, ever" logic.
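A toy sketch of the per-contact learning described above. This is just a bigram table, not the small transformer Apple actually ships; the contacts and messages are invented for illustration.

```python
from collections import defaultdict, Counter

class PerContactPredictor:
    """Toy next-word predictor that keeps separate statistics per contact."""

    def __init__(self):
        # (contact, previous word) -> counts of the words that followed it
        self.counts = defaultdict(Counter)

    def learn(self, contact, message):
        words = message.lower().split()
        for prev, nxt in zip(words, words[1:]):
            self.counts[(contact, prev)][nxt] += 1

    def suggest(self, contact, prev_word, n=3):
        # Only ever suggests words actually used with this contact before
        return [w for w, _ in self.counts[(contact, prev_word.lower())].most_common(n)]

p = PerContactPredictor()
p.learn("mate", "fuck yeah see you there")
p.learn("boss", "that is interesting thanks for sharing")

print(p.suggest("mate", "fuck"))  # ['yeah']
print(p.suggest("boss", "fuck"))  # [] -- never used with this contact
```

The point is the key: suggestions are conditioned on who you're talking to, so swearing only ever surfaces where you've sworn before.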

  • If you really want long term savings... how about ditching your nicotine habit? That's what, $1,500 per year? $100,000 over your lifetime?

    Oh, and medical bills on top of that. Hard to estimate that, but it could be more.

    I can think of better things to spend $100,000 on. Even slot machines at a casino would be a better option.
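A quick back-of-envelope check on those figures. The annual spend, time horizon, and return rate are all assumptions for illustration, not sourced data:

```python
yearly_spend = 1500      # assumed annual cost of the habit
years = 40               # assumed remaining smoking lifetime
rate = 0.03              # assumed annual return if the money were invested instead

# Future value of an annuity: investing the same amount at the end of each year
saved = yearly_spend * ((1 + rate) ** years - 1) / rate
print(round(saved))      # roughly 113,000 -- the $100,000 figure is plausible
```

Even with zero investment return, $1,500 a year for 40 years is $60,000; modest compounding pushes it past the $100,000 mark, before any medical bills.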

  • Developers talk about files, classes, or code

    Huh? No. Good developers talk about test cases, business needs, and functional scenarios. You shouldn't write a line of code without first knowing what the code is intended to do.

    QA is the first and last step of writing good software, and quality should be evaluated throughout development to catch problems early.

    It sounds like the places you've worked at only use it as the last step, which is just wrong. Fix that. Where I work there's an expectation that pre-release QA never finds any problems, because they've already been identified and eliminated (or at least recognised and accepted as a compromise) before the software goes to QA.
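A minimal sketch of "know what the code should do before writing it": capture the functional scenario as a test first, then write the implementation to satisfy it. The discount rule here is invented purely for illustration.

```python
# Business need, captured as a test BEFORE any implementation exists:
# returning customers get 10% off orders over $100.
def test_loyalty_discount():
    assert price_after_discount(total=120.0, returning=True) == 108.0
    assert price_after_discount(total=120.0, returning=False) == 120.0
    assert price_after_discount(total=80.0, returning=True) == 80.0

# Only now write the code -- its job is fully defined by the scenarios above.
def price_after_discount(total, returning):
    if returning and total > 100.0:
        return round(total * 0.9, 2)
    return total

test_loyalty_discount()
print("all scenarios pass")
```

Writing the scenarios first forces the "what is this code intended to do?" conversation to happen before a single line of implementation exists.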

  • Just because a driver has their hands on the wheel doesn't mean they're watching the road. They might be watching a movie.

    As for asking about number plates - that sounds like a distraction that would cause accidents rather than prevent them.

    For me these systems need to be really clear. Either the person is driving, in which case they are fully responsible for every crash, or the car is driving, in which case the car is fully responsible. There's no room for any grey area in the middle.

    In my opinion Tesla should be forced to refund anyone who was told their car has "full self driving". I'm OK with "autopilot" though, since the airplane and boat versions of that feature have always pretty much been "just keep going in a straight line until a human disengages autopilot".

  • What works for me is to only estimate short term work.

    100 points is an average of 5 points per day. I recommend limiting your estimates to maybe 3 points, because even if you get 5 points of work done today... it probably won't be 5 points spent on what you planned to work on when you started the day. You need to plan for the unexpected, and 2 points per day, every day, sounds about right to me.

    Also, if you think a task is worth 2 points but you're not going to start work on it until tomorrow... then it's a total waste of time to spend 10 minutes estimating it now. Between now and when you start on the task you might decide not to do it, or make changes to the scope that significantly increase (or reduce) the amount of work required. Stick to "I can get these tasks, worth three points, done today". Also try to split your project up into tasks that are 1 or 2 points (personally I'd adjust your point system as well... a "point" is worth 30 minutes at my company... and aim for 5 hours per day).

    When you're doing long term plans - don't get into specifics. That works for other industries but it does not work with ours - our work is inherently unpredictable.

    Rather than calculating an amount of time required, my long term plans are deadlines for certain milestones, such as "ship a minimal viable product for QA/testing", "finish the model for X", or "set up performance metrics".

    Figure out what your budget is (say, $200,000 - or 2,000 points if you prefer), then split that budget proportionally among each milestone. Maybe your "performance metrics" milestone gets 100 points. That does not mean you think it will be 100 points of work. It means 100 points is the amount of time you are capable of spending on it right now. Assume it will be very different when you actually start work on it - regularly adjust points as you progress through the project. Maybe something is finished ahead of schedule and you decide to allocate extra points to something that could benefit from more work. Or maybe (more likely?) you're behind schedule and you decide to cut something from the to-do list (as in, move it to the backlog, or reduce the scope).

    Discussions around how much a project will cost should be less about the work required and more about how much you and the customer are both willing to allocate to the project. And if it feels like it can't be done within the budget, then you don't go ahead (for that you need to rely on experience). If the budget exceeds the work required, there's always more work you can do. You could spend ten years doing A/B testing of colors and font sizes if you have the budget for that. And you can also cut corners to reduce a 100 point task down to half a point. Be prepared to do that, because it's the only way to ship projects on time and on budget.

    Locking yourself into a detailed scope ahead of time doesn't leave enough room to compensate for unexpected challenges, mistakes, or a critical worker's kid getting sick and leaving them unable to work in the last week before your deadline. All of these things and more will happen on every long term project. Long term detailed plans are worthless: they're a huge amount of work to create, and you won't follow the plan anyway.
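The budget-splitting approach described above can be sketched in a few lines. The milestone names and weights are made up; the point is that points are allocated top-down from a fixed budget and rebalanced as the project moves, not estimated bottom-up.

```python
def allocate(budget_points, weights):
    """Split a fixed points budget across milestones, proportional to weights."""
    total = sum(weights.values())
    return {name: round(budget_points * w / total) for name, w in weights.items()}

milestones = {
    "ship MVP for QA/testing": 10,
    "finish the model for X": 6,
    "set up performance metrics": 4,
}
plan = allocate(2000, milestones)
print(plan)
# {'ship MVP for QA/testing': 1000, 'finish the model for X': 600,
#  'set up performance metrics': 400}

# Later: the MVP came in 150 points under budget, so move those points to
# whatever benefits most -- the plan is a living budget, not a scope document.
plan["ship MVP for QA/testing"] -= 150
plan["set up performance metrics"] += 150
```

Note that "performance metrics gets 400 points" is a spending limit, not an estimate of the work involved, which is exactly the distinction the comment draws.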

  • Emergency collision avoidance brake systems are already known to trigger at random

    So? They also brake suddenly when a kid runs out onto the road in front of them. Anyone following should have enough of a space cushion to get hard on the brakes and avoid hitting them. Or at least avoid hitting them very hard.

    If you plough into them at speed, you'll push the stopped car forwards and kill the kid.

  • I'm confused. Have you seriously never encountered a human driver who goes slower than you'd like or doesn't merge into a space? Sometimes they don't even know the blinkers are on, or indicate left then swerve suddenly hard right.

    For me, especially as a motorbike rider, the Turquoise indicator would give me confidence that the car isn't going to do something monumentally stupid and get me killed.

    Just yesterday I pulled into an exit lane and the car that was in front of me suddenly swerved hard into my lane to cut me off and stop me from illegally overtaking them. Luckily I hadn't overtaken them, even though they were driving well under the speed limit (even the exit lane had a higher limit than the speed they were doing). At the end of the exit lane they swerved back into their lane and nearly hit someone who had accelerated back up to the speed limit once the idiot slow driver had moved out of their way. Human drivers are far less predictable than robot ones.

  • its a fundamental flaw of modern capitalism

    I disagree. It's a fundamental flaw of the people running Tesla. They're not going to get away with this - there will be lawsuits and serious consequences which could easily have been avoided while still making both short term and long term profits.

  • so they’re effectively un-fireable.

    For me that's the biggest mistake your company is making.

    I'm all for giving someone an opportunity if you think they might be good, and I'd be cautious about accepting an individual's rejection, no matter how strong, since they could be wrong for any number of reasons. However, you have to be able to fire someone, especially early on in their employment.

    If you've sponsored their visa, then (at least in my country) there are still ways to fire them. You just need to help them find a new more appropriate job - maybe even one inside the current company.

    If someone is capable of "completing" a task by paying someone else to do it... then perhaps they have potential as a manager, for example. Delegating tasks to other people is a real skill - and apparently an area where your company is lacking (the most important part of delegating is knowing who to give the task to... and they put this guy on your team! Wtf). Obviously he shouldn't start as a manager, but maybe put him on that career track. Make him an assistant to a manager, for example.

  • Smartphones weren’t a new idea, Palm had been on it since the mid 90’s

    Apple shipped the Newton in 1993, well before Palm. And long, long before shipping the Newton, they were talking about handheld computers. The idea that they copied Palm is ridiculous.

    Like Palm, the Newton wasn't good enough to achieve widespread market adoption (and Apple recognised that - killing it in 1998).

    Sure - the iPhone wasn't the first pocket computer, and it was a very obvious invention that companies all over the world had failed to pull off for decades. I think Microsoft was the closest: their Pocket PC was pretty good, and they had a massive new version, almost rebuilt from scratch after a decade, about to ship when the iPhone came out... But Apple beat them to it and Google followed close behind. Reportedly Google's early hardware partners were planning to ship Windows on those devices, but Microsoft lost out in the contract negotiations - Satya Nadella said they were just too slow, and their hardware partners wouldn't wait for them.

    Apple was first to ship a good pocket computer. That was real innovation. Real innovators are the ones that get it right, and being first (to get it right) matters, because once it's been done once, everyone else can just copy your idea instead of wasting time developing and testing dead end solutions to hard problems. The early Android devices, for example, looked more like the old Pocket PC or a Blackberry. They probably weren't good enough to be successful. They quickly copied ideas like the software keyboard from Apple, and quickly adopted Apple's open source technology like the WebKit rendering engine.

  • I’d say the UI was the biggest shakeup

    I'd say the biggest shakeup was the features Jobs pushed hard in the keynote.

    1. It was a cellphone. A good cellphone. Everyone had a cellphone and nearly everyone hated them. The Blackberry was decent if all you did was send text messages and make phone calls, but it was rubbish at everything else. PocketPC, Symbian, and the various flip phones were even worse, though each specific model had a different set of feature trade-offs. Did you ever try writing an email on a small PocketPC device? You had to press tiny keys with an equally tiny stylus, and text was either almost impossible to read or so large that you couldn't fit enough of it on screen. The larger devices were a good experience, but they were way too big for most people. Even the iPhone was considered huge at the time (it was much bigger than a Blackberry, for example).

    2. It was an iPod. Everyone (who could afford one) owned an iPod, and it sucked having two gadgets in your pocket all day and keeping two gadgets charged. That was the feature that made the iPhone a "must have" product. Combining your phone and your music player was a massive improvement, and an obvious one even if you weren't sure about the other stuff. Other phones could play music by then, but they were all still really terrible. I could only fit a single album on my Symbian phone, and it took hours of stuffing around, reading manuals, and installing buggy software to figure out how to load MP3s onto the device. Yuck.
    3. It was able to browse the internet. The real, full internet. Everyone working a desk job was used to doing that all day every day, but now it was possible to do it away from your desk. That was a huge deal, and I think by far the most meaningful feature of the iPhone... except it was something nobody had ever used before, so it couldn't be the only headline feature.

  • There were a lot of little things that aren't relevant today but were a big deal at the time. For example it had a web browser that actually worked to view the real internet, even though 99% of webpages were designed for screens the size of 30 iPhones.

    Today all webpages are designed to work well on small screens - but that never would have happened without Apple. Or at least it would have taken a lot longer to happen. They got enough people using the internet on a phone to force web developers to support small screens. That was a big achievement - even today it's a massive amount of work to design a webpage that works well with a mouse and with your thumbs. The tools we have now didn't exist back then, and before Mobile Safari there weren't any users of small screens anyway so why would anyone put in all that work?

    Phones with web browsers predated the iPhone. They were completely unusable.

  • Steve Jobs and a lot of the best people at Apple left the company in 1985. The company was taken over by idiots ("bozos" was Steve's preferred term).

    Steve (and all the people at NeXT) returned to Apple 12 years later. Officially Apple "bought" NeXT for nearly half a billion dollars, but in reality that was clever accounting to satisfy investors; Apple was in fact on the brink of going bankrupt. They didn't have half a billion dollars. They didn't even have enough money to cover their employees' salaries. The people from NeXT took over and made Apple into what it is today, and they refer to 1997 as the year that NeXT bought Apple.

    Both the Pippin and the Newton shipped several years after Steve and his core team left. They were products of the "bozo" management team. Both were killed pretty much as soon as Steve came back. He killed a lot of other stupid products as well.

  • app bundles (which I still think are just fantastic) were a NeXT thing.

    App bundles were just a better implementation of resource forks, which were invented by Apple and pre-dated NeXT.

    (which of course became Apple, but they weren’t at the time)

    NeXT was founded by people who worked at Apple (not just Steve) and they were largely put in charge when they came back to Apple. I wouldn't call them separate companies. Just a weird moment in the history of the company. A lot like what just happened at OpenAI.

  • Sorry but that's bullshit. That would be like disregarding all the engineering that goes into developing a car, just because someone else invented the wheel.

    Sure - without that invention they couldn't exist - but real innovation isn't just the foundational features of the product. 99% of the work is in small refinements. For example, for about two hours a day my Mazda is a horrible car to drive, because the sun catches the chrome logo on the steering wheel and blinds the driver. The newer models? They have a slightly different shape on the steering wheel that puts the shiny logo in the shade at that time of day. It takes real work over decades to figure out tiny details like that. Most of the job is things that aren't obvious when you first have the idea to build a product.

    Someone else, probably millions of other people, likely had the idea long ago... the real innovator is the one that actually does the hard work to make it a product someone will actually want to use.

  • Apple is one of the companies behind the USB standard. There are other major companies involved (especially Intel), but they often make really stupid decisions, and I don't think the world would be using USB today if it wasn't for Apple coming on board and doing some really awesome work. USB-C, for example, was designed by Apple. And Thunderbolt, another Intel project, was pretty much exclusive to Apple hardware... it's rumoured that Apple pushed Intel hard to make serious improvements, such as using copper instead of fibre optic, and to fold it into modern USB standards. (Thunderbolt, if you don't know, is basically PCIe over a USB cable. It works so much better than a regular USB connection; the only drawback is it costs slightly more.)

    They took KHTML, a niche rendering engine that nobody had heard of and which didn't work on major websites... and made it into the foundation that backs every browser except Firefox.

    The ARM CPU architecture was technically an independent company, but Apple provided nearly all their funding in the early days, provided ongoing funding for decades before they did anything interesting, and ARM's founding CEO was an Apple employee.

    Most of the best programming languages in the world, especially modern ones but even some old ones that have been re-architected, depend on LLVM - which, while it's an open source project, was for many years worked on almost exclusively by Apple (who hired the university student who started it as a side project and gave him an unlimited budget to make it what it is today).

    They figured out how to make touch screen phones work. Touch screens existed before, but they were shit - in particular, typing was unusable. The first iPhone's keyboard wasn't as good as today's, but Apple was the first to find a way to make it "good enough", and that was some seriously innovative stuff. It looks like a tiny keyboard with touch buttons, but that is not what's going on under the hood. It's far more complex.
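A rough sketch of the kind of thing going on under the hood: instead of snapping a tap to the nearest key, the tap position is weighted by how likely each candidate letter is given what's been typed. The key layout, probabilities, and scoring function here are all invented for illustration; real keyboards use far more sophisticated models.

```python
import math

# A one-row slice of a hypothetical keyboard: letter -> key centre (x, y)
KEYS = {"q": (0, 0), "w": (1, 0), "e": (2, 0), "r": (3, 0)}

def letter_likelihood(typed_so_far, letter):
    # Stand-in for a language model: after "th", "e" is very likely.
    if typed_so_far.endswith("th"):
        return 0.6 if letter == "e" else 0.1
    return 0.25  # no context: all letters equally likely

def resolve_tap(typed_so_far, tap_xy):
    # Score = closeness to the key centre, weighted by the language model
    def score(letter):
        dist = math.dist(tap_xy, KEYS[letter])
        return math.exp(-dist) * letter_likelihood(typed_so_far, letter)
    return max(KEYS, key=score)

# A sloppy tap exactly halfway between "w" and "e" after typing "th":
print(resolve_tap("th", (1.5, 0)))  # -> 'e', because "the" is far more likely
```

Without the language-model weighting, that halfway tap would be a coin flip between "w" and "e"; this is why the same physical tap can produce different letters depending on context.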

    Going forward - the Vision Pro headset has some pretty awesome innovations.

    I could go on, but you get the picture. A really common theme: they take something that already existed (e.g. the mouse) and figure out how to actually make it good enough for people to adopt. It takes a lot of R&D to develop something as comprehensive as, for example, Apple's Human Interface Guidelines.

    Could someone else have achieved those innovations? Sure. If ARM/Apple hadn't done it... I'm sure someone else would have figured out how to make a fast processor that could run all day on a battery small enough to wear on your wrist. But with that, and so many other things, Apple's work was critical (a lot of it software, not hardware - technology like ARC, for example, was critical to reach acceptable levels of efficiency). Somebody else would have done it eventually, but I'd argue Apple made it happen decades earlier than it otherwise would have. And once they proved it could be done, others copied them. Which is awesome - Steve Jobs loved to quote Picasso, "good artists copy; great artists steal", and said Apple did it shamelessly and expected their competitors to do the same... as long as they don't steal branding. That's when Apple's legal team gets fired up, as it did with the early Samsung phones, where everything - even the icons on the home screen, which could easily have been unique - looked like an iPhone.

  • The fine will likely be a percentage of annual global turnover... and since Twitter doesn't make much money, it won't be a huge fine relative to the amount of money Musk has. However, it'll make it that much harder for the company to become profitable, and the bad publicity will cause problems.

    Best case scenario is they comply with the law and maybe the X network will be good again. I wouldn't hold my breath on that though.