Let me just say here how much I appreciate the sex ed I got in school.
I'm talking life-size cross-section models of a human torso that you could take individual organs out of for closer inspection.
One thing we still didn't learn much about is how wildly different periods can be for different people; I very much appreciated a friend explaining that to me.
Everyone has a different threshold for how much motion blur they can bear, and hitting that threshold takes a certain framerate at any given speed of motion on screen.
Pretty much all of those are correct, I'd say. But then again, I don't think the point was to make it incorrect, just to describe it in the least scientific way possible.
Though "sound wave detector" is still too sciency for that purpose.
The faster something on screen moves, the higher your framerate needs to be for a certain level of motion blur.
A 2D point-and-click adventure at 30fps could have comparable motion blur to a competitive shooter at 180fps, for example.
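A rough way to put numbers on that (all figures made up, just for illustration): what matters is how far things move between two consecutive frames, so speed divided by framerate is the quantity that has to stay constant.

```python
# Back-of-the-envelope: perceived smear scales with how far things
# move between two consecutive frames. All numbers are made up.

def pixels_per_frame(speed_px_per_s: float, fps: float) -> float:
    """Distance an object travels on screen between consecutive frames."""
    return speed_px_per_s / fps

# A slow pan in a point-and-click vs. a fast flick in a shooter:
adventure = pixels_per_frame(speed_px_per_s=300, fps=30)    # 10 px/frame
shooter = pixels_per_frame(speed_px_per_s=1800, fps=180)    # 10 px/frame
print(adventure, shooter)  # same per-frame displacement, comparable smear
```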
Framerate is inversely proportional to frametime, which is what makes it harder to notice a difference the higher you go.
From 30 to 60fps? That's an improvement of 16.67ms per frame. 60 to 120 gains 8.33ms, 120 to 240 only 4.17ms, and so on.
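If you want to play with the numbers yourself, it's just 1000/fps; here's a quick sketch:

```python
# Frametime in ms is 1000 / fps, so every doubling of framerate
# only halves the absolute gain.
steps = [30, 60, 120, 240, 480]
for low, high in zip(steps, steps[1:]):
    gain_ms = 1000 / low - 1000 / high
    print(f"{low} -> {high} fps: saves {gain_ms:.2f} ms per frame")
# 30 -> 60 fps: saves 16.67 ms per frame
# 60 -> 120 fps: saves 8.33 ms per frame
# 120 -> 240 fps: saves 4.17 ms per frame
# 240 -> 480 fps: saves 2.08 ms per frame
```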
Ah, something I want to add:
That's only explaining the visual aspect, but frametimes are also directly tied to latency.
Some people might notice the visual difference less than the latency benefit. That's the one topic where opinions on frame generation seem to clash the most, since the interpolated frames provide smoother motion on screen, but don't change the latency.
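Rough numbers to illustrate the clash (assuming a simple 2x interpolation and ignoring the extra buffering real frame generation tends to add):

```python
# Why frame generation splits opinions: it changes what you see
# without changing what you feel. Assumes a simple 2x interpolation
# and ignores the extra buffering real frame gen tends to add.
rendered_fps = 60
displayed_fps = rendered_fps * 2           # 120 fps on screen

visual_frametime = 1000 / displayed_fps    # ~8.33 ms between shown frames
input_frametime = 1000 / rendered_fps      # ~16.67 ms per real, input-sampling frame

print(f"motion smoothness: a new frame every {visual_frametime:.2f} ms")
print(f"latency still tied to the {input_frametime:.2f} ms render cadence")
```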
If ads were just ads, then sure. But now that they serve as trackers too, and are oftentimes hijacked by malware... yeah no, screw all ads.