They took a bunch of rich women, dressed them up in objectifying skintight suits, then flew them to space for 10 minutes in a glorified plane ride on the second-richest guy's dick-shaped rocket. And we're supposed to be "inspired", while women who are actual rocket scientists and astronauts are being erased from NASA's website because of "woke DEI" or whatever.
It's such lazy writing, but it seems like almost everything is written this way these days. Characters make the dumbest possible decisions, and refuse to talk to each other or share important information.
I think it's redundant. I wish the community hadn't kept its slightly cringeworthy Reddit name, but it doesn't make sense to have essentially a copy of the same thing under a new name, especially when it doesn't have many posts to start with.
I'm not quite sure what you mean. If you took a picture of an orange and sampled a pixel from it, it wouldn't look like an orange (fruit) any more, but it would look orange (color). Likewise sampling a pixel from a picture of a piece of gold wouldn't look like gold (metal), but it would still be gold (color).
Gold is a color, though; it's an orangeish yellow. I think what you mean is that when people say "gold", they're usually referring to the material properties of the metal as well. But the actual color does have a spectral hue. Magenta, on the other hand (including the shades of pink that fall under magenta), is not a spectral color; it's just how our brain interprets the combination of signals from our red and blue cones.
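If it helps to see it numerically, here's a quick sketch (the sRGB triples for "gold" and "magenta" are just the CSS defaults, picked for illustration):

```python
import colorsys

# Assumed illustrative values: the CSS colors "gold" and "magenta"
gold = (255, 215, 0)
magenta = (255, 0, 255)

for name, (r, g, b) in [("gold", gold), ("magenta", magenta)]:
    h, _s, _v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    print(f"{name}: hue ~{h * 360:.0f} degrees")

# gold:    hue ~51 degrees, between orange (~30) and yellow (~60), a spectral hue
# magenta: hue ~300 degrees, between blue and red, where no single wavelength exists
```

A monochromatic light source somewhere around 580 nm looks roughly like that orangeish yellow; there's no single wavelength that looks magenta.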
Gemma 3 4b is probably a good model to use, or the 1b if you can't run the 4b or it's too slow.
I wouldn't rely on it for therapy though. Maybe it could be useful as a tool, but LLMs are not people, and they're not even really intelligent, which I think is a prerequisite for therapy.
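If you do want to poke at one locally, a minimal sketch using Hugging Face transformers might look like this (assuming you've installed transformers and torch and accepted the Gemma license on Hugging Face; I'm using the text-only 1b variant here since the 4b is multimodal and heavier):

```python
# Minimal local-chat sketch with a small Gemma 3 model via transformers.
# Assumptions: transformers + torch installed, Gemma license accepted on
# Hugging Face, and enough RAM/VRAM for the 1b instruction-tuned model.
from transformers import pipeline

chat = pipeline("text-generation", model="google/gemma-3-1b-it")

messages = [{"role": "user", "content": "Give me three journaling prompts for this week."}]
result = chat(messages, max_new_tokens=200)

# The pipeline returns the conversation with the model's reply appended last.
print(result[0]["generated_text"][-1]["content"])
```

Swap in the 4b (or a quantized build via llama.cpp or Ollama) if your hardware can handle it.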
You might want to reread my comment, because you're just repeating false claims about color and resolution that I already addressed.
No, I'm not. OLED has better contrast and a wider color gamut than the best CRT. And it can run at a high refresh rate without dropping the resolution below the already-low native resolution.
4K resolution, ultra-wide aspect ratio, and extremely high framerates are simply marketing gimmicks
So anything that your current hardware can't do is a "marketing gimmick"? Okay... But at a minimum that would mean that OLED is just "unnecessarily" better. I'm not saying it has to matter to you, but the benefits of high framerates don't abruptly stop at 120fps, and 4K isn't even reaching the point of diminishing returns unless you're using a tiny 17" display.
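Rough numbers, since this comes up a lot. The helper below is just something I threw together, and the ~60 pixels-per-degree threshold and ~60 cm desk distance are my assumptions, not anything you said:

```python
import math

# Back-of-the-envelope pixels-per-degree for a 16:9 4K panel at desktop distance.
def pixels_per_degree(diagonal_in, horiz_px, vert_px, distance_cm):
    diag_px = math.hypot(horiz_px, vert_px)
    pixel_mm = diagonal_in * 25.4 / diag_px               # physical pixel pitch
    deg_per_px = math.degrees(math.atan(pixel_mm / (distance_cm * 10)))
    return 1 / deg_per_px

for size in (17, 27, 32):
    print(f'{size}" 4K at 60 cm: ~{pixels_per_degree(size, 3840, 2160, 60):.0f} px/deg')

# ~107 px/deg at 17", ~67 at 27", ~57 at 32": only the tiny panel is comfortably
# past the ~60 px/deg threshold, so bigger 4K displays aren't past the point of
# diminishing returns at normal desk distances.
```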
It is not physically possible that a human could see flicker at 85Hz.
This is just not true. You may not notice it, but many people can. There's an issue with LED lightbulbs flickering at 120 Hz (twice the 60 Hz mains frequency, after rectification), for example, and plenty of people perceive that.
Anyway I'm not saying you shouldn't enjoy your CRT, I think it's cool! I just don't think it's better than OLED in any tangible way.
I'll agree that early LCD screens were really bad. TN looks terrible. I think a modern IPS or VA is a better experience than CRT in some ways (often better color, better resolution, display size, etc.), but it still has major issues like poor response time and motion clarity.
CRT does have some advantages. It's good for retro games, since a lot of pixel art was designed around the slight blur that CRTs have (the waterfalls in some games, for example). And they do have good motion clarity compared to sample-and-hold displays, but that's because they're flickery. 85 Hz flicker isn't as bad as 60 Hz, but it's still really uncomfortable for many people. It's one reason almost nobody uses backlight strobing on LCD monitors: the tradeoff isn't worth it for most.
OLED really is pretty close to perfect, though. Vibrant, accurate colors with excellent motion clarity and high-refresh smoothness, virtually infinite contrast...
Trinitron really was ahead of its time, but a 32" 4K 240 Hz P3 OLED doesn't just match it; it far exceeds it.
I think you might be a bit crazy, haha. I do have some nostalgia for CRT, but OLED is far better in every single way.
Larger available size, higher available resolution and better clarity, higher available refresh rate, wider color gamut and more accurate colors, higher contrast ratio, etc.
Not to mention how flickery CRT is.
I 100% get the appeal of old tech, but it's a bit silly to say it's equivalent to modern stuff.
This sort of flickering can be really noticeable, especially at low brightness, with the always-on display for example (although it's still nowhere near as bad as 60 Hz CRT flicker, *shudders*).
But I honestly do not believe that you're able to see 4000+ Hz flickering. If you genuinely can, I'm sure you could get a world record for that.
Think about it this way: everything moves through spacetime at the same "speed", so the faster you go through space, the slower you move through time, which is why photons experience no time.
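A tiny numeric version of that, if it helps (the Lorentz factor gamma = 1/sqrt(1 - v^2/c^2) is standard special relativity; the specific speeds below are arbitrary examples I picked):

```python
import math

# How much proper time passes per second of coordinate time at various speeds.
c = 299_792_458  # speed of light, m/s

for fraction in (0.0, 0.5, 0.9, 0.99, 0.999999):
    v = fraction * c
    gamma = 1 / math.sqrt(1 - (v / c) ** 2)
    print(f"v = {fraction:.6f} c -> {1 / gamma:.6f} s of proper time per coordinate second")

# As v approaches c, proper time per second approaches 0, which is the
# "photons experience no time" limit.
```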
And barely any storage is fine for people who are just using it for web browsing and streaming.
I want local music, and to be able to take pictures without worrying about storage, etc., so ~20GB isn't enough for me, but for some people it really is fine.
Any general purpose consumer device should probably have 64GB or more.
But I don't see the point in disallowing <32GB, since that can still be enough for lots of tablet uses: e-readers, smart home displays, kiosks, etc.
In practice, this just means that low end devices will stay on older versions of Android even more than they do already.
If I move something close enough to my face it appears in view twice seemingly semi-transparent
That sounds like what I experience, and not just for things very close to my face: it happens whenever my eyes are converged on something in front of or behind the object.
But in order to do the dominant-eye test, you need to see only one image in both the foreground and the background simultaneously. So how does that happen unless the view from one eye is at least partially suppressed?
This is one of those things that's really hard to talk about and describe, but I would love to actually understand it. Also no, I can't notice my blind spots.
Looks real to me. What makes you say it's AI?