What helps these machines is built-in SSDs that read at about 2 GB/s. If swapping out 2 GB of background tabs you're not looking at takes a second when you switch to your IDE, you're not really going to notice. The swapping only becomes noticeable if you're actually trying to use all the memory at the same time (big Kubernetes test suites or something).
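Back-of-envelope, assuming ~2 GB/s sustained sequential reads and that the whole 2 GB has to come back in at once (both round numbers, not measurements):

    # Rough swap-in math for the scenario above (Python)
    swapped_gb = 2.0    # tab memory paged out while you were in the IDE
    ssd_gbps = 2.0      # assumed sustained sequential read speed, GB/s
    print(f"worst-case swap-in: {swapped_gb / ssd_gbps:.1f} s")  # -> 1.0 s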
They sell access to data (i.e., ads), which is far more lucrative than selling the data itself. Only companies that are bad at tech sell the data outright (credit card companies, retail, etc.)
Cambridge Analytica was far stupider - that was Facebook just giving the data away for free. The old Facebook Apps APIs were wide open: any app developer could collect whatever they wanted about anyone who used their app (CA made those "do this fun quiz and invite your friends!" kind of FB games), and the APIs just said "we require you to delete this data when the user is done with the app" with no way to enforce it.
The biggest spikes look like they correspond to the new year. So my guess is that the spikes are vacations and show the difference between home-PC and office-PC usage.
You can see the same spikes on e.g. Google's IPv6 chart - when people are away from work, IPv6 penetration goes up; when they're at work, it goes down.
Slow charging speeds at home/work are fine; nobody is burning 100% of their range on their daily commute. The people with 200-mile daily commutes are not buying EVs.
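Rough numbers, assuming a ~7 kW Level 2 home charger, ~3.5 mi/kWh efficiency, and ~10 hours plugged in overnight (all typical-ish guesses, not specs):

    # Range recovered overnight on a "slow" home charger (Python sketch)
    charger_kw = 7.0     # assumed Level 2 charger output
    mi_per_kwh = 3.5     # assumed EV efficiency
    hours_plugged = 10   # overnight
    print(f"range added: {charger_kw * hours_plugged * mi_per_kwh:.0f} mi")  # ~245 mi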
Be careful when interpreting year-over-year statistics. Last year was huge for Apple: in Q3 2022, Apple increased sales 10% while the rest of the PC market dropped a massive 18%.
You're saying "since switching from x86 to ARM, Apple's sales are down! See, it was a bad idea!" - but actually sales have been way, way up and are only now falling back in line with the decline the rest of the PC industry has seen since the covid work-from-home rush ended.
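Toy numbers (made up, not real unit-sales figures) showing the base effect:

    # Why a return to trend reads as a crash year-over-year (Python)
    apple_2022 = 100 * 1.10          # Apple grew 10% into an inflated base
    market_2022 = 100 * 0.82         # the rest of the market dropped 18%
    apple_2023 = market_2023 = 85.0  # both land on the same post-covid trend
    print(f"Apple YoY:  {apple_2023 / apple_2022 - 1:+.0%}")    # -23%
    print(f"Market YoY: {market_2023 / market_2022 - 1:+.0%}")  # +4%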
> the CPU architecture is not as directly tied to the software as it once was
Yeah, it used to be that emulating anything at all would be slow as balls. These days, as long as you have a native browser you're halfway there; then 90% of native software will emulate without the user noticing, since it doesn't need much power at all, and you just need to entice the stuff that really needs power (Photoshop etc.), half of which is already ARM-ready since it supports Macs.
The big wrench in switching to ARM will be games. Game developers are very stubborn - see how all games stopped working on Mac when Apple dropped 32-bit support, even though no Macs had shipped with 32-bit hardware for a decade.
Where I lived before, the city had the municipal power company build the open-access fiber network. They already had all the rights-of-way and lines right up to people's houses, so they were perfectly suited for it.
You didn't realize? It seemed to me that the adults wouldn't SHUT UP about how you'd better enjoy this life while you've got it, because once you grow up life is going to suck!
I'm already seeing people come into software dev support forums asking "ChatGPT said you could do this but it's not compiling", people replying that no, that's not possible, and the askers arguing about it because ChatGPT said so.
Once Elon Musk unleashes his "uncensored" AI chatbot, we're going to be flooded with made-up misinformation. It's going to be a bloodbath.
It's such a Microsoft/IBM format. "Let's use this structural wrapper format! And then just define a format inside one gigantic chunk inside it!"
Meanwhile, Apple had already created AIFF years before and actually used the structure of the wrapper to implement the metadata - and they did it by adopting an open structural format that already existed on the Amiga.
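For illustration, a minimal Python sketch of walking the chunk structure both families share (AIFF/IFF chunk sizes are big-endian, RIFF's are little-endian; filenames are hypothetical, error handling omitted):

    import struct

    # Walk the chunks in an IFF-family container (AIFF, or RIFF/WAV).
    # Layout: 12-byte container header ("FORM"/"RIFF" + size + type),
    # then chunks of 4-byte ID + 4-byte size + payload, padded to even length.
    def walk_chunks(path, big_endian=True):  # big_endian=False for RIFF
        fmt = ">I" if big_endian else "<I"
        with open(path, "rb") as f:
            f.seek(12)  # skip the container header
            while len(header := f.read(8)) == 8:
                (size,) = struct.unpack(fmt, header[4:])
                print(header[:4].decode("ascii", "replace"), size)
                f.seek(size + (size & 1), 1)  # skip payload + pad byte

    walk_chunks("song.aiff")                 # hypothetical files:
    walk_chunks("song.wav", big_endian=False)

On an AIFF file this lists metadata chunks like NAME and ANNO alongside the audio; on a typical WAV you mostly see one fmt chunk and one gigantic data chunk, which is the complaint above.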