> I went off what was shown in the video. And 99th percentile frame length just means there is the occasional long frame. FPS is an average per second.

40 FPS is not the lowest it got. The 99th percentile is at 33.9 ms, which is equivalent to 29.5 FPS.
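The conversion above is just the reciprocal of the frame time. A quick sketch in plain Python (the helper name is ours, nothing benchmark-specific):

```python
# A frame time and its instantaneous FPS are reciprocals of each other.
# Frame times are usually reported in milliseconds, hence the factor of 1000.
def frametime_ms_to_fps(frametime_ms: float) -> float:
    return 1000.0 / frametime_ms

# The 33.9 ms 99th-percentile frame time mentioned above:
print(round(frametime_ms_to_fps(33.9), 1))  # → 29.5
```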
This is sad to read. I hope reviews show something better. Considering the game is heavily CPU bottlenecked, you'd expect DX12 to be a godsend.
> FPS stands for (f)rames (p)er (s)econd. So yes, it does mean over the course of a second. Anyway, it's not that critical, as it is a turn-based game.

FPS is not necessarily an average per second; only average FPS is an average per second. You can also calculate FPS from single frame times (this is how FPS graphs are made).
Either way, the reason I mentioned the 99th percentile frame time was because the discussion was about FPS dips (or, inversely, frame time spikes), which is exactly what the 99th percentile score captures.
> In terms of the 99th percentile frame times: that doesn't really translate to FPS. It's a measure of how often or noticeable stuttering may be. Trying to convert that to FPS is pretty meaningless.

Yes, it means frames per second, but that doesn't necessarily mean it's an average per second. For instance, you can have 3 frames with frame times of 10, 15 and 30 ms. These are equal to instantaneous FPS values of 100 FPS, 67 FPS and 33 FPS (the values that would be plotted in an FPS graph), and an average of 54.5 FPS.
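The arithmetic in that example checks out. A minimal sketch:

```python
# Three frames with frame times of 10, 15 and 30 ms.
frame_times_ms = [10.0, 15.0, 30.0]

# Instantaneous FPS per frame -- the values an FPS graph would plot.
instantaneous_fps = [1000.0 / ft for ft in frame_times_ms]
print([round(fps) for fps in instantaneous_fps])  # → [100, 67, 33]

# Average FPS over the run: total frame count divided by total elapsed time.
average_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
print(round(average_fps, 1))  # → 54.5
```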
> You are talking about a single frame within 100. This is not what we get when we measure for minimum FPS. This is what we get when using FCAT or some other frame time software. You are talking about hiccups, rather than anything over a period of time.
> While hiccups can be annoying, if they are meaningful, they aren't the same thing as having a few seconds of 30 FPS. Most won't realize it even happened if it's close to the average minimum over a second.
> You aren't going to see that single frame time hiccup show up on a minimum FPS chart, unless it was a long enough frame to span most of a second.

Of course it translates to FPS, just the 99th percentile FPS in this case (remember, frame times are simply the inverse of FPS).
> Have you set up FRAPS or similar software before? In FRAPS, 1 second is the minimum time frame to gather your FPS unless altered. In MSI Afterburner, 300 ms is the minimum time frame they give FPS. Anything that happens faster gets averaged out.
> Is it possible the minimum FPS got down to those 85 frames? Yes, but it's not likely those 85 frames happened at the same time. These minimum frames you see in charts are not from a single frame in most cases, unless they go into the software to alter the default values.
> If they did it from a single slow frame, I know for certain some of these Unigine benchmarks would show minimums of 2-4 FPS instead of 30, because some of them include massive frame hiccups that span at least 1/4-1/2 of a second.
> Clearly you have your own ideas, so we don't need to keep going on about it.

I'm not talking about a single frame within 100, I'm talking about 85 frames within 8558 (the benchmark in question had run for this long at the time of the screenshot). Minimum FPS is a different thing altogether (the minimum FPS is the slowest of the 8558 frames). You certainly don't need FCAT or similar software/hardware to measure minimum framerate; FCAT is simply a tool that gives you more accurate frame times in general, and that goes for all of the frames measured, not just the slowest one (the slowest one being the minimum FPS).
The 85 frames in question are probably more or less evenly distributed across the benchmark, and as such are measured over a period of time (but they are of course not averaged across said period, since that's not how percentile values work).
There's really no such thing as an average minimum, only the minimum, which is a single data point (determined by the slowest frame). By extension there is no such thing as a minimum FPS chart, since you can't meaningfully chart a single data point.
What you are talking about is essentially charting all the frames that fall a certain amount below the average framerate, which coincidentally is exactly what the 99th percentile metric is all about. Whether you chart them as frame times or as FPS is irrelevant, since these are just inverse values.
Also, a frame time hiccup most certainly does not have to span most of a second to show up on an FPS chart. For instance, in the benchmark in question we can clearly see multiple frame time hiccups or spikes, and none of them span anywhere close to a second (the graph cuts the top off, but most likely none of the spikes are above 50 ms).
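The 85-of-8558 figure follows directly from how percentiles work: roughly 1% of frames sit above the 99th-percentile frame time. A small sketch (the helper and its sample frame-time values are made up for illustration, using the nearest-rank method):

```python
import math

def percentile_frametime(frame_times_ms, pct=99.0):
    """Nearest-rank percentile: pct% of frames are at least this fast."""
    ordered = sorted(frame_times_ms)
    rank = max(1, math.ceil(pct / 100.0 * len(ordered)))
    return ordered[rank - 1]

# With 100 frames of 1..100 ms, the 99th percentile lands on the 99 ms frame:
print(percentile_frametime([float(t) for t in range(1, 101)]))  # → 99.0

# In a run of 8558 frames, about 1% -- roughly 85 frames -- are slower
# than the 99th-percentile value:
print(math.floor(8558 * 0.01))  # → 85
```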
> I just did a little testing of my own. I used Unigine Heaven for the test, as I know when the benchmark starts there is always a nasty hiccup which causes a low minimum, one which does not exist through the rest of the benchmark.
> I was watching MSI Afterburner track the frame times at the start as well, to help see frame times. I turned on the FRAPS benchmark, then started the Unigine benchmark.
> I saw a 55 ms frame time spike (less than 20 FPS if it continued).
> Unigine saw a minimum of 32.7 FPS.
> FRAPS saw a minimum of 67 FPS.
> It does seem that neither goes off frame times. It would also appear that the time frame they use to gather a minimum varies as well.

Yes, I have used FRAPS, but clearly you haven't, otherwise you would know that FRAPS will capture individual frame times for each individual frame.
The only way the minimum value in an FPS chart is not from a single frame is if the chart has been smoothed in some manner.
So if Unigine doesn't show the accurate minimum frame time value, then it must be because it employs smoothing of some manner.
> Is that not what I've been saying all along? FRAPS, like most benchmark tools, will take your FPS over a minimum period of time. In FRAPS' case, 1 second. I'm guessing Unigine's benchmark uses 1/2 a second, judging by the results. The minimums that you see on these graphs are a result of the same type of techniques, often taken directly from FRAPS.

FRAPS can produce different logs: the frametimes log will give you the frame times for each individual frame, whereas the FPS log simply averages the frame times over one second (which is in effect the same as counting the number of frames over 1 second). The MinMaxAvg log is based on the FPS log.
So, as I said, they both work based on frame times (or, to be completely accurate, they work based on timestamps, with the delta between timestamps being the frame time), but what they do with these frame times differs. By averaging frame times over one second you are effectively smoothing your data, which can obfuscate the existence of frame time spikes. This is also why no one using FRAPS should use the FPS and MinMaxAvg functions for anything other than quick and dirty estimates.
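To illustrate the smoothing point (the numbers here are made up, not taken from FRAPS itself): averaging frames into one-second buckets can make a 55 ms spike all but disappear, while per-frame data shows it plainly.

```python
# A steady stream of 10 ms frames with a single 55 ms hiccup in the middle.
frame_times_ms = [10.0] * 50 + [55.0] + [10.0] * 49

# Per-frame view: the worst frame is obvious.
worst_frame_ms = max(frame_times_ms)
print(round(1000.0 / worst_frame_ms, 1))  # → 18.2 (instantaneous FPS at the spike)

# FPS-log-style view: count how many frames complete in each elapsed second.
fps_per_second = []
elapsed_ms, frames_in_window = 0.0, 0
for ft in frame_times_ms:
    elapsed_ms += ft
    frames_in_window += 1
    if elapsed_ms >= 1000.0:
        fps_per_second.append(frames_in_window)
        elapsed_ms -= 1000.0
        frames_in_window = 0

# The windowed "minimum FPS" barely registers the hiccup.
print(min(fps_per_second))  # → 96
```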
> The point is, a single frame in a second's worth of frames is pretty insignificant, which is why we look at the minimums over a second.

The whole point is that this isn't a real minimum FPS value, since it isn't the lowest value the game/benchmark ran at (which is the definition of a minimum). Furthermore, FRAPS will only report this number if you use the MinMaxAvg function, but as I just explained, if you want to do any real analysis using FRAPS, then you should use the Frametimes function and only the Frametimes function, since anything else gives misleading results.
Either way, I really don't know why we're still discussing this. At this point it should be obvious that FRAPS does in fact use frame times (even if it doesn't report them when you use the FPS or MinMaxAvg function), and so does the Civ 6 benchmark. So all this talk about how benchmarks like Unigine mangle and obfuscate the data is, quite frankly, irrelevant.
Not with those green glasses you're wearing.