VPN used for VR game cheat sells access to your home network
kevincox @lemmy.ml · 6 Posts · 795 Comments · Joined 4 yr. ago

it’s mostly solved already
I wish I believed this. Or I guess I agree that it is solved in most software, but there is a lot of commonly used software where it isn't. One broken bit of software can fairly easily take down a whole site or OS.
Try to create an event in 2040 in your favourite calendar. There is a decent chance it isn't supported. I would say most calendar servers support it, but the frontends often don't or vice-versa.
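If the underlying issue is 32-bit Unix timestamps (the 2038 problem), which is my assumption for this thread, here is a quick sketch of why a 2040 event trips things up:

```python
# Minimal sketch, assuming the problem is 32-bit time_t: an event in 2040 has an
# epoch timestamp that no longer fits in a signed 32-bit integer, so any component
# still storing timestamps in 32 bits will mangle or reject it.
from datetime import datetime, timezone

event = datetime(2040, 6, 1, 12, 0, tzinfo=timezone.utc)
ts = int(event.timestamp())

INT32_MAX = 2**31 - 1  # signed 32-bit time_t overflows on 2038-01-19T03:14:07Z

print(ts)              # roughly 2.22 billion
print(ts > INT32_MAX)  # True: a 2040 event no longer fits in 32 bits
```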
As with most of these things it is pricing based on value.
- Contractor is often fixing or building and cares a lot about the price.
- Most other purchases happen during renovations, so they are a luxury expense, and relatively speaking the faucet will be a small part of that, so it is easy to milk these people for money.
require a separate device that looks like a calculator to use online banking
To be fair this actually provides a very high level of security. At least in my experience with AIB (in Ireland) you needed to enter the amount of the transaction and some other core details (maybe part of the recipient's account number? can't quite recall). Then you entered your PIN. This signed the transaction, which provides very strong verification that you (via the PIN) authorized the specific transaction via a trusted device that is very unlikely to be compromised (unless you give someone physical access to it).
It is obviously quite inconvenient, but it provides a huge level of security. Unlike this SafetyNet crap, which is currently quite easy to bypass.
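For anyone curious how signing the specific transaction buys you that guarantee, here is a loose sketch of the idea (an HMAC stands in for the chip's cryptogram, and all names are illustrative; the real protocol is EMV CAP, and the PIN just unlocks the card locally):

```python
# Loose sketch, not the real EMV/CAP protocol: the card holds a secret the bank
# also knows, and the code the device displays is a MAC over the exact transaction
# details, so the code is useless for any other transaction.
import hmac, hashlib

def signing_code(card_secret: bytes, amount: str, recipient_account: str, challenge: str) -> str:
    msg = f"{amount}|{recipient_account}|{challenge}".encode()
    digest = hmac.new(card_secret, msg, hashlib.sha256).digest()
    # Truncate to an 8-digit code like the device displays.
    return str(int.from_bytes(digest[:4], "big") % 10**8).zfill(8)

# The bank recomputes the same code from the details it was asked to execute;
# a match proves the cardholder saw and approved *these* details on a trusted device.
code = signing_code(b"secret-shared-with-bank", "150.00", "12345678", "903211")
print(code)
```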
which is supposed to enforce to run apps in secured phones
The point of the Google Play Integrity API is to ensure that the user is not in control of their phone, but that one of a small number of megacorps is in control.
Can the user pull their data out of apps? Not acceptable. Can the user access the app file itself? Not acceptable. Can the user modify apps? Not acceptable.
Basically it ensures that the user has no control over their own computing.
I wouldn't call a nail hard to use just because I don't have a hammer. Yes, you need the right hardware, but there is no difference in the difficulty. I understand what you are trying to say; I just wanted to clarify that it isn't hard, just not widespread yet.
which is hard to decode using hardware acceleration
This is a little misleading. There is nothing fundamental about AV1 that makes it hard to decode, support is just not widespread yet (mostly because it is a relatively new codec).
Just to be clear, it is probably a good thing that YouTube re-encodes all videos. Video is a highly complex format and decoders are prone to security vulnerabilities. By transcoding everything (in a controlled sandbox) YouTube takes on most of this risk itself and makes it highly unlikely that the video it serves to the general public can exploit bugs in anyone's decoder.
Plus YouTube serves videos in a variety of formats and resolutions (and now different bitrates within a resolution). So even if they did try to preserve the original encoding where possible you wouldn't get it most of the time because there is a better match for your device.
From my experience it doesn't matter if there is an "Enhanced Bitrate" option or not. My assumption is that around the time that they added this option they dropped the regular 1080p bitrate for all videos. However they likely didn't eagerly re-encode old videos. So old videos still look OK for "1080p" but newer videos look trash whether or not the "1080p Enhanced Bitrate" option is available.
It may be worth right-clicking the video and choosing "Stats for Nerds"; this will show you the video codec being used. For me 1080p is typically VP9 while 4k is usually AV1. Since AV1 is a newer codec it is quite likely that you don't have hardware decoding support for it.
I'm pretty sure that YouTube has been compressing videos harder in general. This loosely correlates with their release of the "1080p Enhanced Bitrate" option. But even 4k videos seem to have gotten worse to my eyes.
Watching at a higher resolution is definitely a valid strategy. Optimal video compression is very complicated, and while compressing at the native resolution is more efficient, you can only go so far with fewer bits. Since the higher resolution versions have higher bitrates they fundamentally have more data available and will give an overall better picture. If you are worried about possible fuzziness you can try using 4k rather than 1440p, as 4k is a clean doubling of 1080p so you won't lose any crisp edges.
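The arithmetic, in case it helps (trivial sketch; 1920 is the 1080p width):

```python
# 4k is an exact 2x of 1080p in each dimension, so upscaling keeps edges crisp,
# while 1440p needs fractional resampling.
for name, width in [("1440p", 2560), ("4k", 3840)]:
    print(name, width / 1920)  # 1440p -> 1.333..., 4k -> 2.0
```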
Your Firefox install contains a file called `omni.ja`. For example, on many Linux machines it will be at `/usr/lib/firefox/browser/omni.ja`. This file is a ZIP archive and contains your `places.xhtml` as well as other browser files. The exact paths are not always obvious as there is some remapping taking place (see the `.manifest` files in the archive), but I think the vast majority of `chrome://` paths come from this archive.
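If you want to poke around in it yourself, something like this should work (the path is the common Linux location above; `omni.ja` uses a slightly non-standard ZIP layout but normal ZIP readers generally cope, sometimes with a warning):

```python
# Quick look inside omni.ja: it is just a ZIP archive.
import zipfile

OMNI_PATH = "/usr/lib/firefox/browser/omni.ja"  # adjust for your install

with zipfile.ZipFile(OMNI_PATH) as omni:
    for name in omni.namelist():
        # Print the remapping manifests plus anything related to places.xhtml.
        if name.endswith(".manifest") or "places" in name:
            print(name)
```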
Permanently Deleted
In particular, they generally pretend that nothing on the web is encrypted, whereas in practice HTTPS is nearly universal at this point.
Permanently Deleted
The use case will change everything. OP is likely using much more memory than you are (especially disk cache usage) so the kernel decided to swap out some data. Maybe you aren't using as much so it has no need.
Permanently Deleted
To put it another way: you want to be using all of your RAM and swap. It only becomes a problem if you are frequently reading from swap. (Writing isn't usually as much of an issue, as these may be proactive writes done so that the pages can be dropped quickly if more memory needs to be freed up.)
Basically a perfect OS would use RAM + Swap such that the least disk reads need to be issued. This can mean swapping out some idle anonymous memory so that the space can be used as disk cache for some hotter data.
In this screenshot the OS decided that it was better to swap out 3 GiB of something and use that space for the disk cache ("Cached"). It is likely right about this decision (though it isn't always).
3 GiB does seem a bit high. But if you have lots of processes running that are using memory but are mostly idle it could definitely happen. For example in my case I often have lots of Language Servers running in my IDE, but many of them are for projects that I am not actively looking at so they are just waiting for something to happen. These often take lots of memory and it may make sense to swap these out until they are used again.
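If you want to check whether swap is actually hurting (on Linux), a rough sketch like this shows the swap-in rate; `pswpin`/`pswpout` in `/proc/vmstat` are cumulative page counts, so sample twice and take the difference. A consistently high swap-in rate means hot data is being re-read from swap, which is when it really costs you:

```python
# Rough, Linux-only sketch: distinguish "swap is used" from "swap is thrashed"
# by watching how many pages are swapped in over a short window.
import time

def swap_counters():
    counters = {}
    with open("/proc/vmstat") as f:
        for line in f:
            key, value = line.split()
            if key in ("pswpin", "pswpout"):
                counters[key] = int(value)
    return counters

before = swap_counters()
time.sleep(5)
after = swap_counters()

print("pages swapped in  over 5s:", after["pswpin"] - before["pswpin"])
print("pages swapped out over 5s:", after["pswpout"] - before["pswpout"])
```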
- Launching Steam games outside of Steam can be very difficult. Some games outright won't allow it.
- Steam provides native libraries such as the overlay, networking and matchmaking tools, achievements... You need the Windows versions of these, which wouldn't be distributed by default with the Linux version of Steam.
- In the past Steam just didn't run under Linux, so you had no other option.
There is an option in settings to allow trying all games. By default it only allows it for tested and verified games, but it is a simple checkbox, and then you can download and run any Windows game.
It used to be common and useful. I did this even after Valve shipped a native Linux TF2, as at the beginning the Wine method gave better results on my hardware. But that time has long passed: Valve has integrated Wine (Proton), and in almost all cases the Linux-native builds will outperform Wine (and Steam will let you use the Windows version via Proton if you want, even if there is a native Linux build).
So while I suspect that there are still a few people doing this out of momentum, habit or reading old tutorials I am not aware of any good reasons to do this anymore.
Warning
Never extract archives from untrusted sources without prior inspection. It is possible that files are created outside of path, e.g. members that have absolute filenames starting with "/" or filenames with two dots "..".
https://docs.python.org/3/library/tarfile.html#tarfile.TarFile.extractall
I would be careful if using this as a general purpose tool.
A better alternative would likely be to use the regular command-line tools, which have been hardened against this type of thing (and are likely much faster), and then just inspect the result: always create a wrapper directory, then if the result is only one directory inside of that, move it out; otherwise just keep the wrapper. I would recommend that the author update their tool to do this rather than the current approach.
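Something like this rough sketch is what I have in mind (names and paths are illustrative):

```python
# Sketch of the wrapper-directory approach described above: extract with the regular
# `tar` binary into a fresh wrapper directory, then flatten the wrapper if the archive
# turned out to contain a single top-level directory.
import os
import shutil
import subprocess
import tempfile

def safe_extract(archive: str, dest_parent: str) -> str:
    wrapper = tempfile.mkdtemp(prefix="extract-", dir=dest_parent)
    # Let the hardened, fast system tar do the actual extraction.
    subprocess.run(["tar", "-xf", archive, "-C", wrapper], check=True)

    entries = os.listdir(wrapper)
    if len(entries) == 1 and os.path.isdir(os.path.join(wrapper, entries[0])):
        # The archive already had a single top-level directory: move it out, drop the wrapper.
        final = os.path.join(dest_parent, entries[0])
        shutil.move(os.path.join(wrapper, entries[0]), final)
        os.rmdir(wrapper)
        return final
    return wrapper  # Tarbomb-style archive: keep everything contained in the wrapper.

# Usage: extracted_dir = safe_extract("release.tar.gz", ".")
```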
It honestly sounds more like someone convincing you that crypto is great than someone convincing you that Greenpeace is great.
"Residential IPs" are quite valuable for web scraping. Many scraping prevention tools and services use the source IP as the primary metric. If you come from a public cloud provider like AWS, GCP or DigitalOcean you get blocked 99% of the time. If you come from a US residential ISP then you get much more relaxed screening.