Posts 0 · Comments 321 · Joined 2 yr. ago

  • The critical feature of MYOB (and Xero), that's largely missing in other options, is integration with the Australian Taxation Office.

    You can easily enter all your business activities, then when it's tax time double check all your data and simply click a button to file it with the government.

    Aside from those two, the only options I know of are a lot more expensive and intended for use by full time accountants (employees or external contractors). MYOB doesn't work on Linux and Xero, which is web based, is sadly lacking features. MYOB also has a web based version but as far as I know it's even more basic than Xero.

  • This. Housing is definitely a human right and it is generally provided in Australia.

    Where it gets more complex is how much should housing cost and what quality of housing should people get for their money? For example can you afford a house to yourself, or do you need to live with other people and share the rent? Maybe even share a bedroom?

    Australia doesn't have a shortage of housing, what we have is a shortage of affordable housing. As in, some people aren't able to pay for the houses that they want to live in and they aren't willing to live in the ones that they can afford.

    Domestic violence is the leading cause of homelessness in Australia. Victims of that often do have a home but it's not a safe one, so they're actually better off on the street. With help, these victims can find a home (and help is available).

  • Facebook and Twitter have become an excellent opportunity and tool to get important causes in front of people's eyes

    CNN/Fox are biased, for sure - but that's nothing compared to straight up lies pushed by large sections of the internet. And those lies tend to perform better than facts on algorithmic timelines that optimise for engagement. For example articles showing "proof" that covid-19 killed various celebrities who are, in fact, very much alive and healthy, with the clear intent to create fear among large sections of society. A tactic that seems to be far too effective.

    I think the world needs to go back to human moderation. Like we have on (well run) fediverse communities.

  • Hell, I regularly use all of my 32GB of memory

    On what operating system?

    I have 16GB on my Mac and half of it goes to a virtual machine. And I'm definitely a heavy user - five browser windows open with who knows how many tabs is pretty common. An IDE or even two, plus all sorts of other stuff, and a bunch of electron apps too.

    MacOS definitely uses "all of the memory", but often at least a few gigabytes (as in, almost half my memory aside from the VM) is dedicated to caching files on disk. And with a fast SSD that's not buying you much performance.

  • ... one of the tests here is editing an 8K video. That's not an every day use case.

    There are pro users that don't need anywhere near that much memory.

    For example QLab. It's definitely "pro" software - but it's just automation software and commonly used for tasks like sending a 20 character text string to another computer on the network when you hit a button... it can do more complex things but most of the time the cheapest Raspberry Pi has enough compute power (you can't run it, or anything like it, on Linux however).

    A MacBook Air would be useless, because it doesn't have HDMI, which is often needed. Professionals don't want to use dongles.

    While most people running QLab won't be too budget sensitive... they might be buying six Macs that won't ever be used for anything else*... and since QLab only uses a few hundred megabytes of RAM, why pay Apple's premium prices for 16GB?

    (* half of them will probably never even be used, since they'd be backups powered on and ready to swap in with a few seconds notice if the main one fails, which almost never happens)
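
    To give a sense of how small that workload is, here's a rough sketch in Python of "send a short cue string to another machine when a button is hit". (QLab itself listens for OSC packets, by default on UDP port 53000; the address and the `GO cue-17` string here are placeholders, not anything from QLab's actual cue list.)

    ```python
    import socket

    # A short cue string -- the entire payload for this kind of job.
    message = b"GO cue-17"  # hypothetical 9-byte cue text

    # Fire-and-forget UDP datagram; swap in the target machine's IP.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(message, ("127.0.0.1", 53000))
    sock.close()
    ```

    A few bytes over UDP per button press: any machine made this century has the compute for it.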

  • Permanently Deleted

  • Every community has rules about what content is on topic, and if you post something else it will be removed. That's not censorship.

    A government statement is a government statement. It is not news. A proper news organisation would, for example, fact check whatever statement the government made and consider if the reader should be given additional context - perhaps details the government might be omitting in order to increase their chances of being re-elected.

    On an issue as politically charged as this one, it's especially important for the full journalistic process to be followed. You're essentially attempting to post to the community as if you are a journalist yourself. But you're not... and even if you were there's no team of people fact checking what you wrote.

    There are communities where you can do that, but US News is not one of those communities.

  • Do they store 32-bit integers as 16-bit internally or how does macOS magically only use half the RAM? Hint: it doesn’t.

    As a Mac programmer I can give you a real answer... there are three major differences... but before I go into those, almost all integers in native Mac apps are 64 bit. 128 bit is probably more common than 32.

    First of all, Mac software generally doesn't use garbage collection. It uses "Automatic Reference Counting", which is far more efficient. Back when computers had kilobytes of RAM, reference counting was the standard, with programmers painstakingly writing code to clear things from memory the moment they weren't needed anymore. The automatic version is the same, except the compiler writes that code for you... and it tends to do an even better job than a human, since it doesn't get sloppy.

    Garbage collection, the norm in modern Windows and Linux code, frankly sucks. Code that, for example, reads a bunch of files on disk might keep all of those files in RAM for ten seconds even if it only needs one of them at a time. That can burn 20GB of memory and push all of your other apps out into swap. Yuck.
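
    As it happens, CPython also uses reference counting, so plain Python can illustrate the deterministic cleanup that ARC gives native Mac apps (this is an illustration only, not what a Mac app would actually run):

    ```python
    # With reference counting, an object is destroyed the instant the
    # last reference to it goes away -- not at some later GC pause.
    class Resource:
        def __init__(self, name):
            self.name = name
        def __del__(self):
            print(f"{self.name} freed")

    r = Resource("big buffer")
    r = None  # refcount drops to zero -> prints "big buffer freed" now
    print("next statement runs after the free")
    ```

    A tracing garbage collector would instead leave "big buffer" in memory until some future collection cycle decided to sweep it.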

    Second, swap, while it's used less (due to reference counting), still isn't a "last resort" on Macs. Rather it's a best practice to use swap deliberately for memory that you know doesn't need to be super fast. A toolbar icon for example... you map the file into swap and then allow the kernel to decide if it should be copied into RAM or not. Chances are the toolbar doesn't change for minutes at a time or it might not even be visible on the screen at all - so even if you have several gigabytes of RAM available there's a good chance the kernel will kick that icon out of RAM.
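
    The map-the-file-and-let-the-kernel-decide pattern can be sketched with `mmap` (Python here purely for illustration; a temp file plays the part of the toolbar icon):

    ```python
    import mmap
    import os
    import tempfile

    # Write a stand-in "toolbar icon" file to disk.
    fd, path = tempfile.mkstemp()
    with os.fdopen(fd, "wb") as f:
        f.write(b"ICONDATA" * 1024)

    # Map the file instead of read()ing it all into RAM: pages are only
    # faulted in when touched, and the kernel may evict them again
    # whenever it decides the memory is better used elsewhere.
    with open(path, "rb") as f:
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as icon:
            print(icon[:8])  # touching these bytes pages them in

    os.remove(path)
    ```

    The app keeps a valid "pointer" to the icon the whole time, but it only occupies physical RAM while the kernel thinks it's worth keeping there.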

    And before you say "toolbar icons are tiny" - they're not, really. The tiny favicon for beehaw is 49kb as a compressed png... but to draw it quickly you might store it uncompressed in RAM. It's 192px square at 32 bit colour, so 192 x 192 x 4 bytes ≈ 144KB of RAM for just one favicon. Multiply that by enough browser tabs and... ouch. Which is why Mac software would commonly keep the favicon as a png on disk, map the file into swap, and decompress the png every time it needs to be drawn (the window manager keeps a cache of the window in GPU memory anyway, so it won't be redrawn often).
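
    Spelled out, with 32-bit colour meaning 4 bytes per pixel:

    ```python
    side = 192
    bytes_per_pixel = 4  # 32-bit RGBA
    uncompressed = side * side * bytes_per_pixel
    print(uncompressed)                 # 147456 bytes, ~144 KiB per icon
    print(100 * uncompressed // 2**20)  # ~14 MiB for 100 cached favicons
    ```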

    Third, modern Macs have really fast flash memory for swap. So fast it's hard to actually measure it, talking single digit microseconds, which means you can read several thousand files off disk in the time it takes the LCD to refresh. If an app needs to read a hundred images off swap in order to draw to the screen... the user is not going to notice. It will be just as fast as if those images were in RAM.
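
    Rough numbers behind that claim (the 5 µs read latency is an assumption for the sake of arithmetic, not a measured figure):

    ```python
    # How many small swap reads fit inside one refresh of a 120Hz panel?
    frame_us = 1_000_000 / 120      # one frame is ~8333 microseconds
    read_us = 5                     # assumed flash read latency
    print(int(frame_us / read_us))  # ~1666 reads per frame
    ```

    Even with conservative latency assumptions, the kernel can pull well over a thousand small objects out of swap before the screen refreshes once.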

    Sure, we all run a few apps that are poorly written - e.g. Microsoft Teams - but that doesn't matter if all your other software is efficient. Teams uses, what, 2GB? There will be plenty left for everything else.

    Of course, some people need more than 8GB. But Apple does sell laptops with up to 128GB of RAM for those users.

  • they also made the chip so it’s not much of a defence

    It's a pretty old chip though. They shipped it over a year ago and even that was mostly just an upgrade from LPDDR4 to LPDDR5. Which is a substantial upgrade, real world performance wise, but most of the engineering work would've been done by whoever makes the memory - not Apple's own chip design team who presumably were working on something else (I'd guess desktop/laptop chips, and those certainly do have USB-3).

    Apple certainly could have included USB-3 support in those chips... but three years ago there wasn't any pressing reason to do so, and this year they've added support to the models with the most expensive camera. And anyone who cares about data transfer speeds will buy the one with the best camera.

  • Apple has always been an enemy of the free software community

    Apple is one of the largest contributors to open source software in the world and they've been a major contributor to open source since the early 1980's. Yes, they have closed source software too... but it's all built on an open foundation and they give a lot back to the open source community.

    LLVM, for example, was a small project nobody had ever heard of in 2005, when Apple hired the university student who created it and gave him an essentially unlimited budget to hire a team. Fast forward almost two decades and it's by far the best compiler in the world, used by both modern languages (Rust/Swift/etc) and old languages (C, JavaScript, Fortran...), and it's still not controlled in any way by Apple. The student they hired was Chris Lattner; he's still president of the LLVM project even though he's moved on from Apple (he's currently CEO of an AI startup called Modular AI).

  • One of the features they highlighted is that the built in display has very similar specs to their 6K 32" professional display (which, by the way, costs more than this laptop). So when you're not working at your desk you'll still have a great display (and why buy a laptop unless you occasionally work away from your desk?)

    • Both have a peak brightness of 1600 nits (a Dell XPS will only do ~600 nits, and that's brighter than most laptops).
    • Both have 100% P3 color gamut (Dell XPS only gets to 90% - so it just can't display some standard colors)
    • even though it's an LCD, black levels are better than a lot of OLED laptops
    • contrast is also excellent
    • 120Hz refresh rate, which is better than their desktop display (that only runs at 60Hz, same as the Dell XPS)
    • 245 dpi (again, slightly better than 218 dpi on the desktop display, although you sit further away from a desktop... Dell XPS is 169 dpi)

    I love Dell displays, I've got two on my desk. But even the Dell displays that cost thousands of dollars are not as good as Apple displays.

  • We'd thankfully just disconnected from Optus (not easy, we had a commercial fibre line) a few days before this outage. They've been getting progressively less reliable and outages had almost become routine.

  • GPT-4 is already kinda slow - it works best as a "conversational" tool where you ask follow up questions and clarify things that have already been said. That's painful when you have to wait 10 seconds for a response. I couldn't imagine it being useful if it was minutes.

  • There are models comparable to GPT-3.5 "Turbo", which is faster and 30x cheaper than GPT-4 (if you pay OpenAI's regular API prices).

    I suspect that's because GPT-4 needs 30x more memory than 3.5.

    I'm not aware of any other model that performs as well as GPT-4. In fact I suspect even 3.5 Turbo is the second best model.

  • To put some numbers on it - regular RAM runs at tens of gigabytes per second (bytes, not bits). High Bandwidth Memory runs at several hundred gigabytes, or sometimes terabytes, per second (OpenAI is likely using the latter, and that memory isn't just expensive, it's also supply constrained, so prices are astronomically high right now).

    You can buy HBM, and you can use it as your main system RAM, but it's painfully expensive. The total bandwidth also scales linearly with the amount of memory you buy. So a 500GB configuration is 10x faster than a 50GB one - because it writes to all of the chips simultaneously (and then reads from all of them when you access the data back).
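
    A sketch of that scaling, with made-up per-stack numbers (real figures vary by memory generation):

    ```python
    # Illustrative assumption: each stack holds 50GB and contributes
    # 100 GB/s; stacks are written to (and read from) in parallel.
    STACK_GB = 50
    STACK_BW_GBPS = 100

    def total_bandwidth(capacity_gb: int) -> int:
        """Bandwidth grows with the number of stacks used in parallel."""
        return (capacity_gb // STACK_GB) * STACK_BW_GBPS

    print(total_bandwidth(50))   # 100 GB/s
    print(total_bandwidth(500))  # 1000 GB/s: 10x capacity, 10x bandwidth
    ```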

    It's pretty standard on high end GPUs these days. Apple uses a similar wide-memory approach on all their recent computers (if you buy a Mac with 64GB of RAM, the memory runs at up to 800GB/s - not quite as fast as a high end GPU, but close - though technically it's very wide LPDDR rather than true HBM). It's part of why Macs are so expensive (and also why the cheaper ones have very little RAM).