Europe’s world-first AI rules get final approval from lawmakers. Here’s what happens next
abhibeckert @ lemmy.world · Posts 0 · Comments 1,096 · Joined 2 yr. ago
there have been several reports by foreign media casting doubt on the veracity of Israel’s claims
If there's any truth to that, I hope UNRWA has compensated and apologised to the people they fired.
Someday the AI will get good, and I’ll want to chat with it securely.
GPT-4 is pretty good now. I'm not convinced it will be secure until we can run it locally on our own hardware.
As soon as we can run it locally, I plan to do so, even if that means settling for a GPT-4 quality LLM when far better models are available as cloud services.
Sure it would be nice to have something that hallucinates less than GPT-4, but I kinda feel like striving for that is making perfect the enemy of good. I'd rather stick with GPT-4 quality, and focus on usability/speed/reliability/etc and let people keep working on the fancy theoretical stuff in the background as a lower priority.
As Steve Jobs said, Real Artists Ship. If you keep working on something forever, until you can't think of any more improvements, you'll never ship.
The habit of sending tokens right as they generate is a dumb sales gimmick
Seems like it would be trivial to place tokens in a buffer on the server and send output to the client in, say, 1KB chunks (a TCP packet's payload can't be much bigger than that anyway, since it needs a bit of space for routing metadata).
And if the entire output is less than 1KB, pad it out to that length. That's pretty standard anywhere you care about security. For example, if you were to dump a database's password table, the stored values are all 256 bits. That's obviously not the real length of anyone's password: most will be shorter, some will be longer. Whatever the real length, the value is cryptographically hashed down to a fixed 256 bits.
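A minimal sketch of that chunk-and-pad idea (my own illustration, in Rust; the 1KB chunk size and zero padding are assumptions, and a real protocol would need to carry the true length inside the encrypted stream):

```rust
// Sketch (my assumptions, not from the post): buffer the generated output
// and emit fixed-size 1 KiB chunks, padding the final chunk so an observer
// can't infer the response length from packet sizes alone.
const CHUNK_SIZE: usize = 1024;

fn chunked_with_padding(output: &[u8]) -> Vec<Vec<u8>> {
    // Split the buffered output into CHUNK_SIZE pieces.
    let mut chunks: Vec<Vec<u8>> = output.chunks(CHUNK_SIZE).map(|c| c.to_vec()).collect();
    if chunks.is_empty() {
        chunks.push(Vec::new()); // even an empty reply sends one full-size chunk
    }
    // Zero-pad the final chunk up to the fixed size; a real protocol would
    // also encode the true payload length somewhere inside the stream.
    if let Some(last) = chunks.last_mut() {
        last.resize(CHUNK_SIZE, 0);
    }
    chunks
}

fn main() {
    let reply = b"Hello"; // 5 bytes in, one 1024-byte chunk out
    let packets = chunked_with_padding(reply);
    assert_eq!(packets.len(), 1);
    assert_eq!(packets[0].len(), CHUNK_SIZE);
}
```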
We usually do not do multi-people multi year projects
Seriously - why not?
Say you're doing an experiment, wouldn't it be nice if someone else could repeat that experiment? Maybe in 3 years? in 30 years? in 3,000 years time? And maybe they could use your code instead of writing it themselves and possibly getting it wrong?
If something is worth doing, then it is worth doing properly.
Classes are cool, but they are not needed and often obscure clarity
I write code all day professionally. A lot of my code doesn't use classes. I agree they often "obscure clarity".
But sometimes they do the opposite - they make things crystal clear. It's important to know how to use classes and even more important to know when to use them. I guarantee some of the work you do could benefit from a few simple classes. They don't need to be complex - I wrote a class earlier today that is only four lines of code. And yes, a class was appropriate.
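To give a flavour of what a small class buys you, here's a hypothetical example (in Rust, so a struct plus an impl; it's not the actual class I wrote today): wrapping a bare number in a named type documents intent and stops it being mixed up with other numbers.

```rust
// Hypothetical example: a tiny type that exists purely for clarity.
// The wrapper names the unit, so a Celsius value can't be accidentally
// passed where a plain f64 (or a Fahrenheit value) was expected.
struct Celsius(f64);

impl Celsius {
    fn to_fahrenheit(&self) -> f64 {
        self.0 * 9.0 / 5.0 + 32.0
    }
}

fn main() {
    let temp = Celsius(21.5);
    println!("{:.1}°F", temp.to_fahrenheit()); // 70.7°F
}
```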
Apple's official policy is you can't access that money as cash "except as required by law".
In other words, if UK law says you are entitled to the money, then Apple would know that, and you simply need to contact Apple and ask them to give the money to you, as cash (or a bank transfer).
They don't like doing it, they want you to spend the money with them, but in many countries they are required to by law. Because it legally is your money - Apple classifies it as being spent when you use it to buy something, not when you add to your balance.
Depending on how the money was deposited in your account, you might not get all of it back. For example, Target often sells $100 gift vouchers for $80. Try to get $100 cash out in that situation and there's a good chance Apple will only give you $80 (Target didn't make a $20 loss on that gift card, and neither will Apple). But it doesn't sound like that will affect you.
PS: Transmit is awesome. Been a user for decades. The Mac App Store version is more expensive (subscription pricing, with a lot of the money going to Apple) and has fewer features (some important features are not allowed in App Store apps) - so yeah, don't buy it from the App Store.
Can anyone recommend good resources for learning programming
Honestly? No. The best resource is you. Ask questions. Get experience. Ask questions. Get experience. Repeat.
It's not enough to learn. You also have to do. And you really should learn by doing in this field.
First of all - fuck Python. I'm sure it's possible to write good code in that language, but it's not easy and it requires a lot of discipline. I don't mean to be mean to Python, it's a truly wonderful language, arguably one of the best languages, when used properly. But it sounds like you're not using it properly.
Pick a language that:
- Has static typing
- Does not do garbage collection
Static typing forces you to have more structure in your code. You can have that structure in a dynamic language, but nobody ever does in practice, and part of the reason is that all of the libraries and third party code you interact with assume you'll use dynamic typing as a crutch to quickly and easily get around hard problems.
It's far better to actually solve those problems, rather than avoid them. You'll tend to create code where bugs are caught when you write the code instead of when someone else executes the code. That "you vs someone else" distinction is a MASSIVE time saver in practice. It took me about 20 years, but I have learned dynamic typing sucks. It's convenient, but it sucks.
For more info: https://hackernoon.com/i-finally-understand-static-vs-dynamic-typing-and-you-will-too-ad0c2bd0acc7
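To make the "when you write it vs when someone else executes it" point concrete, here's a made-up Rust example (Rust being the language I recommend below): the bad call is rejected by the compiler the moment you write it, rather than blowing up later for a user.

```rust
// Made-up example: with static types the mistake below is a compile
// error, caught while writing the code rather than when a user runs it.
fn total_price(quantity: u32, unit_price_cents: u64) -> u64 {
    quantity as u64 * unit_price_cents
}

fn main() {
    // let bad = total_price("3", 999); // compile error: expected `u32`, found `&str`
    let good = total_price(3, 999);     // 2997 cents
    println!("{} cents", good);
}
```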
On garbage collection - it's a similar issue. It's really convenient to write code where "someone else" deals with all that memory management "garbage" for you, but the reality is you should be thinking about memory when you write your code, because at its heart, 99.999% of the code you write is just moving memory around. Garbage collection is like "driving" a Tesla with autopilot active. You're not really driving at all. And you can do a better job if you grab the wheel and do it yourself.
I recommend starting with a manually memory managed language (like Rust) to learn how it works, and from there you might try a language that does most of the work for you without completely taking it out of your hands (for example Swift, which has "automatic" memory management for common situations, but it's not a garbage collector and in some edge cases you need to step in and take over... a bit like cruise control in a car, if we're going to keep that analogy).
It's getting harder these days to find a language that doesn't have garbage collection. The industry has gone decades thinking GC is a good idea and we just need one more fix, which we're working on, to handle that edge case where it fucks up... and then we find another edge case, and another, and another... it's a bit of a mess and entire papers have been written on the subject. But anyway, some of the best and newest languages (Rust, Swift, etc.) don't have garbage collection, which is nice (because the old GC-free options, C and Fortran, suck to write and I'm not recommending them).
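If you want to see what "thinking about memory" looks like in practice, here's a trivial, made-up Rust snippet: every value has a single owner, moves are explicit, and memory is freed deterministically with no collector involved.

```rust
// Trivial made-up example: ownership makes memory management something
// you reason about while writing the code, not something a collector
// cleans up behind your back.
fn main() {
    let s = String::from("hello");
    let t = s; // ownership of the heap buffer moves to `t`...
    // println!("{}", s); // ...so this line would be a compile error: `s` was moved

    let u = t.clone(); // copying the buffer is allowed, but it's explicit
    println!("{} {}", t, u);
} // `t` and `u` are freed here, deterministically, with no GC pause
```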
That's enough for now. Just keep muddling about learning those languages first before trying to tackle bigger problems. Programming is a difficult task, just like a baby learns to sit up, then roll over, then crawl, then stand, then walk with assistance, then stumble around, then walk, then run, then ride a bicycle with three wheels, then a two wheel one with no pedals, then a bicycle with pedals, then a car after that...
You skipped all those steps and went straight to driving a car (with autopilot). To learn properly, you don't need to go all the way back to "sitting up and crawling", but you should maybe go back just a little bit. Figure out how to get code to run, at all, in a language like Rust, and get familiar with it.
After you've done that come back here and ask what's next. We can talk about SOLID, Test Driven Development, all the intricacies of project management in git, exceptions vs returning an error vs failing silently, and when to use third party code vs writing your own (phew boy that's a big one...).
But for now - just learn a lower level language. Programming is a bit like physics. You've got elements, and under that atoms, and under that... well I don't even know what's under that (you're the scientist not me). There are "layers" to programming and it's important to work at the right layer and it's also important to understand the layer below and above the one you're working at.
If Python is at layer x, then you really need to learn layer x-1 in order to be good at Python. You don't need to go all the way down - you can't go all the way down (how do magnets work?).
WAN throughput limit is nearly 1Gbps
In my experience, exactly 1Gbps. It has 1Gbps network ports, and it maintains that throughput even with "advanced buffer management" / etc enabled.
I'm sure it slows down if you have thousands of people using it, but OP isn't planning on that, and anyone who is should buy one with more than four LAN ports anyway. This is a $60 router. If you're serving thousands of people, you should spend more than that.
Start with a Ubiquiti EdgeRouter X. It's a tiny little box that's easily hidden away and forgotten about, with five Ethernet ports (one for the internet, four for your home). The web interface is extensive and has every feature you could ever want and thousands of other features you can safely ignore.
It does not do wifi - and that's fine. Because for wifi to work well, the antenna has to be in a central location where you probably don't want half a dozen ethernet cables, power supplies, etc etc.
You can use it with almost any wifi access point (or even a full wifi router, configured to not do any routing), but I recommend one of these: https://ui.com/us/en/wifi/flagship
They have five current models on that page but there are more:
- U6 Enterprise - designed to be used by several hundred people at the same time. Forget that one.
- U7 Pro - the latest flagship Wifi 7 model (you said you don't even care about wifi 6, so probably forget that too)
- U6 Pro - their previous flagship, with Wifi 6. Probably overkill for you but worth considering
- U6 Long Range - basically the same device but with a physically larger antenna to extend the range over 2,000 feet under ideal conditions
- U6+ - a confusingly named cheaper variant that is also smaller. I would buy this one — not because it's cheaper, but because it's the smallest one.
They are all ceiling mounted. Ceiling mounts are the way to go. Put one in the middle of a large central room in your home. It will provide perfect 5GHz coverage within your home and your devices will seamlessly switch to 2.4GHz when you leave the house (it'll probably cover your entire back/front yard and maybe even a bit down the street... even if you don't buy the "Long Range" model).
If your house has walls (or floors) that make it a Faraday cage, then you will need to buy more than one access point. Often only one is needed, but they are designed to work together if you require more (potentially thousands; these access points are used in football stadiums, music festivals, skyscrapers, etc).
If you can't drill a hole in your ceiling, then buy a thin (flat profile) white ethernet cable and use 3M adhesive strips to attach the cable and wifi access point to your ceiling. Nobody will notice unless they look up. You might need to patch up the paint when you move out, but ceiling paint is dirt cheap and very forgiving (because it's matte).
If you refuse to go with a ceiling mounted access point, Ubiquiti has wall mounted and bench top variants. But they're not as good - ceilings are usually made of thin flimsy material while walls are usually solid structures. That makes a big difference when it comes to real world wireless performance and reliability.
It's a bit more than your budget, but I'd argue it's money well spent. My EdgeRouter X and old Unifi access point are approaching 7 years old and they have never even been restarted, except when we've had power failures or when I've moved house... totally worth the money. The only problem I ever had was about 5 years in, when I forgot the password and wanted to change a setting... I had to do a factory reset. No biggie.
But if that's too expensive, you should be able to find older models of the same hardware (especially predecessors to the U6+). Like I said, mine is 7 years old and working perfectly. I could see myself still using it in another 7 years - anything where I need really high performance is connected to the EdgeRouter X with an ethernet cable.
PS: one of the ethernet ports on your EdgeRouter X is a "PoE OUT" port. Plug your Unifi wifi access point into that port, and you can toss the power supply that came with the access point in a drawer or just the rubbish bin. The EdgeRouter X will provide power over the ethernet cable.
Note: some Ubiquiti hardware is garbage, and the company seems to be going downhill lately. But they still have excellent products.
It sounds like you’re saying that they don’t.
Honestly, I think you're really arguing over the technical definition of "sell".
get popups on a fresh install of an Apple OS and on first launching certain apps that asks me outright if I want to send usage data to Apple
Yeah but do you know what data is being sent? Most people have no idea (you might, I'm just saying most people). My position is if people don't know what's in the data, then they aren't really agreeing to it with full knowledge.
Do you haven’t any evidence for this?
I've seen the data (from my own apps), and I can see how easy it is to link crash reports to users. Crash reports include a unique device identifier and also loads of information about the device the moment it crashed. It's trivial to compare all of that data to other data the app collects and find out which user the crash report belongs to.
I doubt that’s something Apple would be happy about.
I'm sure it's a violation of the terms of service, but developers violate those all the time and enforcement is almost unheard of. When Apple catches an app breaking the rules, they usually just tell the developer to stop. The damage is already done by then.
Have a listen to this to get an idea how widespread this is: https://subclub.com/episode/app-store-ethics-dark-patterns-and-rule-breakers-steve-p-young-app-masters
What is the primary book used
There isn't one. Most people don't learn this stuff by reading a book.
The best way to learn is by looking at actual assembly code, then researching what each instruction does. I wouldn't start with compiler generated code, though. Being computer generated, it's often quite messy and obviously undocumented. Best to start with easier to read code like the examples I've included below — a simple "print Hello World" in CISC (x86), then in RISC (ARM).
Notice CISC uses `mov`, `int` and `xor`, while RISC uses `mov`, `ldr`, and `svc`. You should look those up in a manual (plenty of free ones online) but in simple terms:

- `mov`: move memory from one place to another. RISC and CISC have the same instruction here, but they're not identical
- `int`: means interrupt; essentially stop execution (for a moment) and hand execution over to other software
- `xor`: modifies a value (an XOR operation)
- `ldr`: "load register"; it loads a value from elsewhere in memory
- `svc`: means "supervisor call", which is used in much the same way as `int`. The code is asking the kernel to do something (once to write to stdout, and once to terminate the program)
```asm
section .data
    helloWorld db 'Hello World',0xa   ; 'Hello World' string followed by a newline character

section .text
    global _start

_start:
    ; write(1, helloWorld, 12)
    mov eax, 4              ; system call number for sys_write
    mov ebx, 1              ; file descriptor 1 is stdout
    mov ecx, helloWorld     ; pointer to the string to print
    mov edx, 12             ; length of the string to print (11 chars + newline)
    int 0x80                ; call kernel

    ; exit(0)
    mov eax, 1              ; system call number for sys_exit
    xor ebx, ebx            ; exit status 0
    int 0x80                ; call kernel
```
```asm
.section .data
helloWorld:
    .asciz "Hello World\n"

.section .text
.global _start

_start:
    @ write(1, helloWorld, 12)
    mov r0, #1              @ file descriptor 1 is stdout
    ldr r1, =helloWorld     @ pointer to the string to print
    mov r2, #12             @ length of the string to print (11 chars + newline)
    mov r7, #4              @ system call number for sys_write
    svc 0                   @ make system call

    @ exit(0)
    mov r0, #0              @ exit status 0
    mov r7, #1              @ system call number for sys_exit
    svc 0                   @ make system call
```
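(For reference: the first listing is NASM syntax targeting 32-bit x86 Linux, which you'd assemble with `nasm -f elf32` and link with `ld -m elf_i386`; the second is GNU assembler syntax targeting 32-bit ARM Linux, built with the `as` and `ld` from an ARM toolchain.)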
don’t put all your eggs in one basket
That's a good approach - but there's a better one. If at all possible stick to software that uses standard data formats and is able to interact with other software. For example Lemmy uses Markdown (a standard) and it can interact with other software (on the fediverse).
If we ever decide to stop using Lemmy, there's a good chance all of the valuable content we're writing — like this discussion — will live on in whatever other software we decide to switch to instead of Lemmy. Because being Markdown, it's easy to import, and being on the fediverse, it will be easy to transition to a replacement gradually over time with the new software and lemmy both being used at the same time during a potentially years long transition period.
Unfortunately I don't know of any (good) web browser that does that. It's certainly possible for bookmarks/tabs/settings/etc to be synced between browsers, but in general browsers only ever support one-off imports; they never actively maintain a shared set of data between browsers.
But there's an out — extensions. For example I don't use the password manager built into my browser. I use a browser extension for passwords and my password manager has an extension for all browsers. Obviously as locked down as passwords need to be, I don't want my passwords accessible outside that app/those extensions, but it does have a good import/export feature and I have used it to test other password managers. I should really look for a good extension that manages bookmarks well and syncs them between browsers.
If I were on Apple, I would be using Firefox
I dunno if that's true. There are some really good browsers on the Mac that I suspect don't run (or don't run well) on whatever operating system you do use. Access to awesome Mac only software is the reason I use a Mac, even though I don't particularly like the company Apple has become (they were a wonderful company 20 years ago in my opinion).
I've done iOS/Mac app development — Apple doesn't "sell" data to me, but they absolutely provide me with extensive user tracking data for free (well, for $99 per year, but that's effectively free). As far as I know they also provide data to other third parties, such as in the News app. But app developers are the big one.
The data is anonymised, but I assure you it's very detailed. Detailed enough that some companies probably cross reference it with other tracking and are able to link the data they get from Apple to real people.
Thankfully the tracking is opt-in, but users are forced to make a choice and encouraged to enable tracking and I'd argue they really aren't being educated properly on what they're handing over before making a decision. I can't really blame Apple for that, who wants to spend hours learning how Apple's tracking methods work? But it is a fact that Apple does collect a lot of data and they do share it.
Personally I have spent hours doing that research and I'm not OK with what they track — I opt out. And while my own software does have some tracking, it's a lot less detailed than the tracking Apple does. It's just basic analytics (roughly how many users do I have and what country are they from?) and crash reporting which is (thankfully) rare with my software and therefore useless for any invasive tracking. The vast majority of people using my apps never experience a crash (and that's only possible because I track crashes).
Google might be the primary maintainer of Chromium, but they don't really control it. Literally hundreds of other companies and thousands of individual developers contribute to Chromium every day, and if Google did something they don't like, the engine would be forked in a heartbeat.
In fact it has been forked — thousands of times (according to GitHub). It's just that none of those forks have gained much traction. If Google really messes things up, such as actually going ahead and removing third-party cookies as they've threatened to do for years, then one or two of those forks will gain traction. Likely enough traction that Google themselves would struggle to keep up, and could end up killing Blink and basing Chrome on one of the forks.
If you don't trust Google (I don't), then don't use Chrome. But I wouldn't write off all Chromium based browsers, some of them are awesome. And the main problem it used to have (battery life) isn't an issue anymore. My M1 MacBook Air lasts forever on battery power and I always have a chromium based browser running.
With respect, I disagree. Rendering pages quickly and reliably is table stakes and all modern browsers do a great job of that. It doesn't really matter at all what rendering engine is under the hood as long as it works well.
I'm glad we have three rendering engines, especially since the largest two are backed by companies who don't always do what's right for the web... but three is enough. More than that would honestly be a waste of effort, I prefer the current situation with hundreds of browsers who pool resources and work together on a rendering engine that is shared by other browsers.
What really separates one browser from another is the toolbars and other user interface elements around the webpage. And Blink/WebKit/Gecko don't provide any of that.
Way back in the day, the best browser was OmniWeb. It was truly awesome but quite expensive (I think a license was about $60?). Unfortunately they didn't have the resources to keep up as CSS/JavaScript became more complex. It still worked for the vast majority of websites when they gave up on development, but the writing was on the wall and they weren't selling enough licenses to hire a large team. Also, back then the only open source browser was Firefox, and its engine has always been really complex to work with (there's a reason everybody uses Blink or WebKit as the foundation for their browser).
As far as I know, OmniWeb is the only (major) browser that was exclusively designed for the Mac (and NeXT before that). Even Safari historically ran on Windows, and the current version borrows quite a lot of UI conventions from the iOS version. OmniWeb was a proper Mac browser. In fact, back in the early days of Mac OS X, OmniWeb wasn't just the best Mac browser, it was arguably the best Mac app in general. They'd been working on it for years while other Mac apps were either brand new Cocoa apps or still using Carbon (the compatibility layer between Mac OS 9 and Mac OS X).
OmniWeb is kinda-sorta alive as a side project, using WebKit now instead of their proprietary engine, and the latest "test build" was released just a couple of months ago. But the last stable/officially supported version, OmniWeb 5, shipped twelve years ago. It's somewhat dated now; for example the URL bar is the full width of the window and you can't change that, a holdover from the days when even desktop computer screens were only 800 pixels wide or less. https://omnistaging.omnigroup.com/omniweb/
One of the early developers of OmniWeb (retired a long time ago) once claimed OmniWeb is older than WorldWideWeb (generally recognised as the first ever web browser), but given the web didn't exist back then he wasn't able to point to any strong evidence. Wikipedia lists 1995 as the release date for OmniWeb, however he said that date is wrong and it was distributed years earlier (obviously not on the web — there was no other web browser, so you had to get it some other way).
These days, I think the best web browser (and therefore also the best Mac browser) is Arc. It's not exclusively a Mac app, but it is written in SwiftUI and the iOS/Windows versions are quite different - Arc respects platform specific UI conventions and different use cases (especially on a phone).
Here's a link to download it: https://arc.net/gift/70d85b6 (unfortunately you do need to sign up with an email account, since Arc is "software as a service" and (like OmniWeb did) they eventually plan to start charging for certain features. I'm OK with that personally; you do need an account to sync tabs between devices, which I see as a must-have feature).
A couple corrections:
- China also blocks TikTok (I shit you not)
- The US isn't "blocking" TikTok, they are forcing the parent company to sell it
If they refuse to sell, then sure, the US will follow through with a block... but that's not the intention. I guess the question is how much does ByteDance care about their US market? The US is TikTok's largest market, but it's still only about 5% of TikTok users. There are almost as many Indonesian users, and Brazil isn't far behind. Plus Mexico, Russia, Vietnam, the Philippines...
And some of those countries might not want a US company to control TikTok.
Zuckerberg has said he doesn't think it's possible for any social network to operate (with significant marketshare) in every country, which is why he's interested in the Fediverse. If there has to be a wide ecosystem of social networks, then users should be able to access content posted to other networks.
Permanently Deleted
Uh, no they're not. They share the same core operating system.
The only real difference is the security model (as you say, tightly locked down), but macOS has been gradually adopting a lot of that over time. For example, / used to be an ordinary volume; these days it's mounted read only and can't be written to even with sudo. iOS has always been like that.
They are different operating systems, but only because it's easier to make a change on one of them and then port it to the other later. Possibly years later. In general, they're pretty close. The main difference is the hardware, not the operating system.
Permanently Deleted
The APIs are similar, but the hardware requires a different approach.
For example, touch screen input is very different to mouse input - you need to decipher imprecise user input... and then provide precise input to webpages that are designed assuming the user has a mouse. There are touch APIs on the web, but developers tend not to use them because dealing with imprecise input sucks. For example, press a link with your thumb and it will highlight. Lift your thumb and it will go to the link. But if you press, then move your thumb, then release... instead of clicking the link it scrolls the page. Unless you move only a little bit - then it does click...
And the only way to get "all day" battery life out of a 10Wh battery is by keeping the CPU powered off most of the day. So you have to figure out how to maintain the current state of the webpage, so it can be restored if the CPU is powered off and back on again, without breaking things like JavaScript timers/etc.
Firefox has solved those issues (and others) on Android. But while Android runs on similar hardware, that operating system is nothing like iOS.
All the work to get Gecko working on Android made sense back in the day, when Android didn't have a good rendering engine. It would also have made sense back in the early days of the iPhone, when WebKit was nowhere near as good as it is now. But today, when someone else has already figured out solutions to every problem? Is it worth reinventing all those wheels?
Permanently Deleted
Apple does allow other engines in Europe. Whether or not Firefox chooses to ship one remains to be seen.
There's nothing wrong with WebKit, so there's not much incentive for Firefox to do all that work.
Permanently Deleted
30% increase in daily installs ≠ 30% increase in users.
Yeah, the Lemmy headline is poorly written (the source article is pretty clear).
Still, 30% is a substantial jump and will eventually turn into a bunch more money for Firefox - a good thing if you ask me.
If my sandwich shop sells 30% more sandwiches one day, that doesn't mean I'm certain to make 30% more money at the end of the year. I might make more, I might make less.
It costs money to make sandwiches. Mozilla doesn't even pay for bandwidth (Apple has that covered), so the Firefox iOS app essentially only has fixed overheads. Which means more users are pure profit.
My understanding is China's rules are pretty wide open and effectively boil down to "if we don't like your use of AI, we will shut you down"... which isn't really much of a law. China would've done exactly the same thing without that law. There is some stuff in there about oversight/etc but that's about it.
Most of China's AI legislation is actually focused on encouraging companies to invest in AI, it's not really about regulating it. The US also has a bunch of AI laws in the same vein as China.
The EU legislation is much more detailed, and outright prohibits AI in a bunch of specific situations. For example, they have made it illegal to use AI for face recognition in "public spaces".