Why is 1% & 0.1% frame rate LOWER than min. FPS?

MaximumSid

I was benchmarking some games with FRAPS and MSI Afterburner, and I noticed that the '1% time' and '0.1% time' figures were lower than the 'Minimum FPS'. So why is the 1% low FPS LOWER than the MINIMUM FPS? If I understand correctly, it's an average of the lowest 1% of FPS, so shouldn't 'minimum' be lower than the '1% lows', since it's the lowest the FPS ever dropped to?

[Attached screenshot: noob.jpg]

GekkePietert
Min FPS is the worst second that actually happened, while the 0.1% and 1% lows are based on the worst frame times (they're the average of the worst 0.1% or 1% of frame-time spikes).


Just now, GekkePietert said:

Min FPS is the worst second that actually happened, while the 0.1% and 1% lows are based on the worst frame times (they're the average of the worst 0.1% or 1% of frame-time spikes).

Okay, so Min FPS is the lowest FPS that actually happened, but the 0.1% low is the average of the worst 0.1% of FRAME TIMES between frames. So when those frame times are converted to FPS (like FRAPS does in this screenshot), the converted 0.1% figure can come out LOWER than the MINIMUM FPS the game ran at?
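
To make that concrete, here is a minimal Python sketch (the helper names min_fps and percent_low_fps are made up for illustration; FRAPS and Afterburner may bucket and average slightly differently). A single 200 ms stutter barely dents the worst one-second frame count, but it dominates the worst frame times, so the converted 1% and 0.1% figures land below the minimum FPS.

```python
# Minimal sketch of the two calculations, assuming frame times in milliseconds.
# Helper names are made up for illustration, not real FRAPS/Afterburner code.

def min_fps(frame_times_ms):
    """Worst one-second interval: count how many frames start in each whole second."""
    counts = {}
    t = 0.0
    for ft in frame_times_ms:
        second = int(t // 1000)               # which second this frame falls into
        counts[second] = counts.get(second, 0) + 1
        t += ft
    return min(counts.values())

def percent_low_fps(frame_times_ms, percent):
    """Average of the worst `percent` of frame times, converted to FPS."""
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, round(len(frame_times_ms) * percent / 100))
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

# Exactly 10 seconds of frames: steady 10 ms frames (100 FPS) with one 200 ms stutter.
frames = [10.0] * 500 + [200.0] + [10.0] * 480

print(min_fps(frames))                 # 81  -> the stutter second still contains 81 frames
print(percent_low_fps(frames, 1.0))    # ~34 -> the 200 ms frame drags the average down
print(percent_low_fps(frames, 0.1))    # 5   -> the single worst frame: 1000 / 200 ms
```

The minimum FPS averages over a whole second, so one slow frame gets diluted by the fast frames around it, while the percentile lows look at individual frame times directly.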
