why are incels frowned/hated upon?
Replace the walking with bicycling and that would be everything (the Netherlands, so cycling is the default mode of transportation), except the mall, we don’t do malls.
Can you give me real world applications of any of the things you mention, that people who shoot using Android find lacking versus iPhone?
This is not about users, this is about developers. What a user will notice is that the Android version of certain apps doesn't work as well as the iOS version. We put a lot of effort into getting an acceptable user experience on Android, but there is only so much you can do. For example: in one specific image processing pipeline we use an internal resolution of 2 MP on iOS and 0.5 MP on Android. The Android version generally also runs at a much lower frame rate, and certain features may be unavailable on Android entirely. What the end user will notice is that it just doesn't feel as smooth on Android as it does on iOS.
Android is generations ahead with hardware, on par with CPU/GPU, better NPU
That's hilarious.
This is the fastest Android phone according to Geekbench; compare it to the fastest iPhone. For our specific application it's the single-core CPU and GPU performance that matter most (any algorithm that can be parallelised runs on the GPU; the rest doesn't really benefit from more than two CPU cores).
Of course, the benchmarks above don't really matter, because you don't develop for the fastest phone, you develop for the slowest one you need to support. Our stuff needs to be usable by the general public, and due to the nature of what we make we need to support basically any phone that has any significant real-world use. In practice this means that on the iOS side our stuff needs to run with decent performance on the iPhone 7 (released in 2016) and later. Here are the benchmark scores for the iPhone 7.
Now compare this to the Samsung Galaxy A14, which has less than a quarter of the single-core performance. Note that this is not the crappiest Android phone we need to support; it's the most popular Android phone of 2023. The oldest and slowest iPhone we need to support is still significantly faster than the best-selling Android phone of last year.
The nice thing about iPhones is that every iPhone sold is a high-end device, performance wise. Most Android phones sold are low to mid-range devices. While there is still a significant performance gap between high-end Android and high-end iPhone, the performance gap between what you actually need to support on iOS and Android is enormous.
does not need paid apps to shoot RAW video or photo
Neither does iPhone: Settings -> Camera -> Formats -> Apple ProRAW (photo), Apple ProRes (video).
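For third-party apps, opting in is a few lines of AVFoundation. A minimal sketch, assuming an `AVCapturePhotoOutput` already attached to a running capture session (names here are illustrative, error handling omitted):

```swift
import AVFoundation

// Sketch: opt in to Apple ProRAW on an AVCapturePhotoOutput that is
// already attached to a configured, running capture session.
func captureProRAW(with photoOutput: AVCapturePhotoOutput,
                   delegate: AVCapturePhotoCaptureDelegate) {
    if photoOutput.isAppleProRAWSupported {
        photoOutput.isAppleProRAWEnabled = true
    }
    // Pick a ProRAW pixel format and request a capture in it.
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes
        .first(where: { AVCapturePhotoOutput.isAppleProRAWPixelFormat($0) })
    else { return }
    let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```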
get current ISO value in Open Camera and plenty of apps that can utilise Camera2 or CameraX APIs. There seems to be quite a bit of misinformation in there
Please link to the API docs that describe this API. To be specific: it needs to be able to set the camera to automatically manage exposure, and then read the actual ISO values chosen by the camera while it's adjusting the exposure.
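For contrast, here is roughly what that looks like on iOS with AVFoundation, where `AVCaptureDevice.iso` is key-value observable (a sketch, error handling omitted):

```swift
import AVFoundation

// Sketch: let the camera manage exposure itself, then observe the ISO
// values its auto-exposure algorithm actually chooses, live.
func observeAutoISO(on device: AVCaptureDevice) throws -> NSKeyValueObservation {
    try device.lockForConfiguration()
    device.exposureMode = .continuousAutoExposure  // camera manages exposure
    device.unlockForConfiguration()

    // AVCaptureDevice.iso is KVO-compliant; this fires on every change.
    return device.observe(\.iso, options: [.new]) { _, change in
        print("auto-exposure picked ISO \(change.newValue ?? 0)")
    }
}
```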
Android is so far ahead in real world, it is comical. iPhone’s advantages are always on paper and never in reality, which as we know never matters.
Sounds like you have never in your life written a single iOS/Android app that requires any significant amount of processing power. I wish I lived in your fantasy world where Android is on par, let alone ahead, because it's such an enormous PITA to have to deal with this enormous performance gap. It would make my life so much easier if Android phones were just half as fast as iPhones.
In the end I really don't care what OS it runs, I just want to build nice things. It just gets frustrating sometimes when having to take into account all these low-end Android devices limits what you can accomplish.
Fragmentation is a disadvantage in this way, but it allows for a far bigger strength – democratisation of technology
How is not supporting certain hardware features ‘democratisation’ ? This is not something users would be able to make an informed decision about, or even know about. No user goes into a phone shop and asks for a phone that has a GPU with support for SIMD-permute functions, to name one thing.
Camera2 API for years that gives us manual camera controls.
This is a good example. The Camera2 API is still not supported by all phones, to the point that they had to come up with CameraX to hide the mess of dealing with multiple camera APIs from developers. Even then, the Camera2 API is a joke compared to what you can do with AVFoundation.
One example: on iPhone I can set up the camera to deliver full-resolution frames (8, 12 or 24 MP depending on the model of phone) to my app at a minimum of 30 fps. Not only can I capture full-resolution images, I can get a synchronized stream of metadata (e.g. detected faces) with these frames. In addition to processing these frames in my code, I can simultaneously write the video to a compressed video file, again at full native camera resolution, with a custom time-synchronized metadata stream.
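Roughly, the setup looks like this (a simplified sketch: error handling is omitted, the delegate bodies are stubs, and the recording path is only indicated in comments):

```swift
import AVFoundation

final class CaptureDelegate: NSObject,
    AVCaptureVideoDataOutputSampleBufferDelegate,
    AVCaptureMetadataOutputObjectsDelegate {

    // Full-resolution frames arrive here; in a real pipeline they are both
    // processed and appended to an AVAssetWriter for simultaneous recording.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) { }

    // Face observations, timestamped against the same clock as the frames.
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) { }
}

func makeSession(delegate: CaptureDelegate,
                 queue: DispatchQueue) throws -> AVCaptureSession {
    let session = AVCaptureSession()
    session.sessionPreset = .inputPriority  // honor the format set on the device

    let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                         for: .video, position: .back)!
    // Select the highest-resolution format the camera offers.
    let best = device.formats.max { a, b in
        CMVideoFormatDescriptionGetDimensions(a.formatDescription).width <
        CMVideoFormatDescriptionGetDimensions(b.formatDescription).width
    }!
    try device.lockForConfiguration()
    device.activeFormat = best
    device.unlockForConfiguration()
    session.addInput(try AVCaptureDeviceInput(device: device))

    let frames = AVCaptureVideoDataOutput()
    frames.setSampleBufferDelegate(delegate, queue: queue)
    session.addOutput(frames)

    let metadata = AVCaptureMetadataOutput()
    session.addOutput(metadata)  // must be added before setting object types
    metadata.metadataObjectTypes = [.face]
    metadata.setMetadataObjectsDelegate(delegate, queue: queue)

    return session
}
```

The simultaneous recording side goes through AVAssetWriter, with an AVAssetWriterInputMetadataAdaptor for the custom timed-metadata track.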
You can’t do anything even remotely like that on Android. You can’t get a video stream at native resolution, you’re lucky if it can do 1080p. You can’t grab raw video frames and record to a movie file at the same time. You can’t get a synchronized metadata stream. You can’t write custom metadata streams to video recordings. Hell, you can’t even get the current ISO value from the camera during live capture.
This is just one example, but there are many more areas where Android is severely lacking in capabilities.
Android may offer more customization options for end-users, but for developers iOS is so much more powerful that they aren’t really in the same league.
Practically, you can simply do a lot more on iOS, especially if you are doing things that require a lot of performance. The CPU is much, much faster, you have access to more RAM, and the GPUs are more advanced. There is actually a GPU compute API that works on all phones, instead of the mess on Android where there simply isn't anything that is universally supported.
APIs are also a lot more powerful on iOS. Anything related to media, for example. You have so much more control and advanced APIs for things like the camera, dealing with video data, etc. There simply is no comparison. Android is a toy OS compared to iOS.
I work on some stuff that has to run on iOS and Android, computer vision related. So often when we work on new functionality it's "no, we can't do that on Android", never the other way around. CPUs are slower, GPUs are slower, and worse, they lack features. Fragmentation is an absolute PITA. There isn't even a decent GPU compute solution that works on all Android phones. RenderScript is too high-level and you can't even be sure it runs on the GPU. GPU compute is not supported on all versions of OpenGL ES, and only a subset of phones support either the latest version of OpenGL ES or Vulkan. It's an absolute mess.
Even if a phone supports Vulkan, there are tons of optional features that you can’t rely on being supported.
By comparison, iOS is absolutely amazing. Almost all iPhones run an up-to-date iOS version, so you can know for certain which GPU features are supported. Both GPU and CPU are also a lot more capable than on Android phones. For example: A-series SoCs have support for hardware-accelerated matrix operations, and on the GPU side you have things like SIMD-group operations (permute, reduction, matrix). All stuff that you can't use on Android, because it's either not supported at all or only a handful of phones support some of them.
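Concretely, a feature check on iOS is a single query against the GPU-family table (a sketch; which family includes which feature comes from Apple's Metal feature set tables):

```swift
import Metal

let device = MTLCreateSystemDefaultDevice()!
// GPU families map directly onto hardware generations, so capability
// checks are a simple query instead of per-device guesswork.
if device.supportsFamily(.apple7) {
    // Apple7 (A14 and later): SIMD-group matrix operations are available
    // in shaders (simdgroup_matrix), per Apple's feature set tables.
}
```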
As a developer: superior hardware, years ahead of Android in terms of performance. Android is way too limited in what you can do with it as a developer.
I wish they did the opposite. Release a €999 PS5 Pro with specs that justify that price.
Sales are slowing down because people are expecting a PS5 Pro.
What do you think?
Since when can LLMs break codes?
DVDs are still not bad if someone really wants to buy a movie. Cheaper than Blu-ray and with much weaker DRM. Video is very low quality by today's standards, but the bitrate and audio quality are better than any streaming.
DVD video bitrate is capped at 9.8 Mbit/s, and even that is used very inefficiently due to MPEG-2 encoding. When DVD was invented we did not have the processing power in affordable hardware for better codecs. Streaming services can do at least twice that bitrate, with much, much better codecs. Audio quality is similar; streaming services actually have higher-bitrate audio than most DVDs (AC-3 at 448 kbit/s on DVD vs. ~770 kbit/s E-AC-3 on streaming). DTS could go higher (either 768 kbit/s or 1.5 Mbit/s) but only supported 5.1 channels.
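As a back-of-the-envelope check (my arithmetic, not from the DVD spec): a two-hour movie at the full 9.8 Mbit/s peak would need 9.8 Mbit/s × 7200 s ÷ 8 ≈ 8.8 GB, which is already more than the 8.5 GB of a dual-layer disc, so in practice discs average well below that peak.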
They don’t have to give Sony or Nintendo a 30% cut on anything sold on Xbox.
The question is if it’s worth it for them. They have to design, manufacture and support that hardware, which is sold at low or maybe even negative margin. They get a percentage from 3rd party sales but with their low sales numbers it may not be worth the effort.
There is also the fact that Microsoft’s core business is software, not hardware. Even if the hardware business makes a small profit there is also a cost associated with a lack of focus in a company.
For example, AWS is super popular even though it would be a lot cheaper for a lot of companies to run their services on their own hardware. They simply don’t want to have to deal with all that because it’s not their core business. At heart, Microsoft is still a software company, it makes sense for them to focus on the software side of the game business.
This is what I expect as well, with maybe some reassurance that they have no intention of pulling the Xbox Series S/X off the market. Then, when Sony announces the PS6, people will be waiting for an announcement of the next-gen Xbox. And waiting, and waiting, and waiting…
They will continue selling it this generation, and quietly drop out after that.
What kind of TV do you need bro? A 60 inch with 4k is more than enough, especially when you think about how far you are gonna sit from a 60 inch TV.
You misunderstand the point of higher resolutions. The point is not to make the image sharper, the point is to make the TV bigger at the same sharpness. This also means the same viewing distance.
At the end of the CRT era I had a 28” TV; at PAL resolutions that is ~540p visible. At the end of the HD era I had a 50” TV. Note that the ratio between resolution and size stayed roughly the same. Now we’re partway through the 4k era and I currently have a 77” 4k TV. By the time we move to the 8k era I expect to have something around 100”. 8k would allow me to go up to a 200” TV.
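To spell that ratio out (vertical resolution divided by diagonal, my arithmetic): 540/28” ≈ 19, 1080/50” ≈ 22, 2160/100” ≈ 22, 4320/200” ≈ 22. Each resolution jump roughly doubles the diagonal you can use at the same perceived sharpness and viewing distance.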
I sit just as far from my 77” TV as I sat from my 28”, my 50” or my 65”. The point of a larger TV is to have a larger field of view, to fill a larger part of your vision. The larger the FoV, the better the immersion. That’s why movie theaters have such large screens, and why IMAX theaters have screens that curve around you.
Don’t think of an 8k TV as sharper; think of 4k as a cropped version of 8k. You don’t want to see the same things sharper, you want to see more things. Just like when we went from square to widescreen TVs: the wider aspect ratio got us extra content on the sides, while the 4:3 version just cut off the sides of the picture. So when you go from a 50” 4k to a 100” 8k, you can see this as getting a huge additional border around the screen that would simply be cut off on a 4k screen.
Of course, content makers need to adjust their content to take into account this larger field-of-view. But that’s again a chicken/egg problem.
The endgame is to have a TV that fills your entire field-of-view, so that when you are watching a movie that is all you see. As long as you can see the walls from the corners of your eye, your TV is not big enough.
It’s a chicken/egg problem. We need 8k so we can use bigger TVs, but those bigger TVs need 8k content to be usable.
I remember reading somewhere that TV show writers/producers are well aware of this and it’s kind of a sport to get the most ridiculous hacking scene in.
Real hacking would be kind of boring to show, it would just be a guy staring at a screen for hours on end and occasionally typing something.
Counting starts at one, indexing starts at zero.
I think different people have different reasons for disliking it.
For me it’s the writing. Specifically: the first half does its very best to make you hate a specific character, then the second half has you play as that character. I get what the writers were trying to do. The problem I have with it is that it doesn’t make for a fun game. I don’t want to play a character I hate.
The writers were so intent on making a specific point that they forgot that they were making a video game. A video game is different from e.g. a movie in that the player is a part of the story, they take on the role of the character they are playing.
For this to work there has to be some part of the character the player can identify with. When playing Ellie, the player can identify with the rage she’s feeling. For Abby, there’s nothing to identify with. She’s mad that Joel killed her father but Joel was entirely justified in killing him. Her father was a bad person and deserved to die.
It makes it very hard for me to put myself in her shoes. As a result I just didn’t enjoy playing as her and quit the game after realizing that it wasn’t just a short section but the entire second half of the game.
This is not true for everyone. Meeting new people is difficult, especially when you’re older. Add to that several (mental) health issues that mean actually going on a date would be practically impossible. Even getting past that, I wouldn’t be able to give a woman the life she deserves.
I don’t blame anyone, I wouldn’t date me either. Can I do things to improve myself? Sure, but not enough for it to matter, the real fundamental problems will remain. Why waste effort on things that give no return on investment?