The finding is intuitive as well: the higher the contrast, the higher the frequency has to be for flicker to go undetected. Humans can easily see lightning even though a flash lasts roughly 100–1000 µs, down to about 5 µs for a single stroke (though it's debatable whether one can actively see those). Taking the reciprocal of the duration, that corresponds to a theoretical 1,000–10,000 FPS, and 200,000 FPS for the single stroke. That's obviously the worst case: the highest contrast found in nature. Screens are dim compared to lightning or the sun, so they achieve far lower absolute contrast and don't need such high frequencies to avoid visible flicker. It may seem ironic, but the lower the screen brightness, the lower the susceptibility to flicker.
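For reference, those "theoretical FPS" numbers are just the reciprocals of the flash durations; a quick sketch of that arithmetic (the durations are the rough figures quoted above, not measurements):

```python
# Back-of-envelope: the "equivalent frequency" of a flash is the
# reciprocal of its duration.
durations_us = {"long flash": 1000, "short flash": 100, "single stroke": 5}

for name, us in durations_us.items():
    hz = 1_000_000 / us  # 1 / (duration in seconds)
    print(f"{name}: {us} us -> {hz:,.0f} Hz equivalent")

# long flash: 1000 us -> 1,000 Hz equivalent
# short flash: 100 us -> 10,000 Hz equivalent
# single stroke: 5 us -> 200,000 Hz equivalent
```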
The duration of a single flash is meaningless here, and completely unrelated to FPS. Sure, you can see one brief flash (incredibly brief, if it's incredibly bright); that's simply about getting enough photons onto the sensor to be detected, but it isn't remotely related to FPS.
For there to be FPS, there must be more than one frame.
FPS is more about the shortest interval you can detect between two flashes. In the 50–100 Hz range they will look like one flash, unless there is motion to separate the flashes spatially. This is known as the flicker fusion threshold.
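To put numbers on that interval, a small sketch (the refresh rates are just illustrative):

```python
# Interval between two consecutive flashes at a given rate: 1 / rate.
# In the 50-100 Hz band this is 10-20 ms, short enough that the two
# flashes fuse into one for most viewers when nothing is moving.
for hz in (50, 60, 100, 240, 1000):
    interval_ms = 1000 / hz
    print(f"{hz:>4} Hz -> {interval_ms:.1f} ms between flashes")
```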
The paper above is looking at an extreme edge-case artifact that you won't see on a normal display. Basically, it's citing the rainbow effect on DLP displays: when you combine motion on screen with the sequential color flashing and the motion of your eyes, colors get misplaced and you get color fringing at the edges.
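A rough back-of-envelope for why eye motion matters there (the field timing, eye velocity, and pixels-per-degree below are made-up illustrative numbers, not figures from the paper):

```python
# Hypothetical single-chip DLP: R, G, B fields are flashed sequentially,
# offset in time. If the eye sweeps across the screen during that offset,
# each field lands on a different retinal position and the colors separate
# at edges (the "rainbow" fringe).
field_offset_s = 1 / 180       # assume 3 color fields per 60 Hz frame, ~5.6 ms apart
eye_velocity_deg_s = 100       # assumed eye tracking/saccade speed, deg/s
ppd = 40                       # assumed pixels per degree for this viewer/display

fringe_deg = eye_velocity_deg_s * field_offset_s
print(f"color fringe ~ {fringe_deg:.2f} deg ~ {fringe_deg * ppd:.0f} px")
# On continuous-color panels the fields aren't offset in time, so this term is ~0.
```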
You won't get this artifact on displays that have continuous color (like LCD/OLED/MicroLED).
Artifacts on oddball displays aren't evidence that you need 1000 Hz monitors.