Graphics card performance 2.0
Traditionally, graphics card performance has been expressed in frames per second (fps). The assumption has always been that the more fps your graphics card can produce, the better video games will perform. But is it really that simple? Not quite, as it turns out. To really determine how a graphics card handles a video game, the average fps is not an adequate measure of performance; you need more detail. Frametime tests provide exactly that. Read on to find out what that entails.
To understand why fps values don't tell the whole story, it helps to know what frames per second really means: it's the average number of frames a graphics card renders each second over the course of a benchmark run.
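To make that concrete, here is a minimal sketch of how an average fps figure is derived. The frametime values are hypothetical, but benchmark tools record data in this form: the number of milliseconds each individual frame took to render.

```python
# Hypothetical frametime log in milliseconds per frame; benchmarking
# tools record data in this shape during a test run.
frametimes_ms = [16.7, 16.9, 16.5, 33.2, 16.8, 17.0]

# Average fps = total frames rendered / total elapsed time in seconds.
total_seconds = sum(frametimes_ms) / 1000.0
average_fps = len(frametimes_ms) / total_seconds
print(f"Average fps: {average_fps:.1f}")  # Average fps: 51.2
```

Note that the single 33.2 ms frame (a brief stutter) barely moves the average, which is exactly why the average alone can hide problems.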
The human brain perceives individual images displayed in a rapid sequence as a moving image. Science hasn't quite agreed on how many frames per second are needed for us to perceive the image as completely fluid, but the general consensus is that 25-30 fps is enough. When you see a film at the cinema it runs at 24 fps, and you rarely hear complaints that people found the film jittery or not smooth enough (unless they just watched The Blair Witch Project or one of its off-shoots).
For a video game to feel smooth, the graphics card needs to produce at least 30 fps. But unlike in movies, the frame rate in games isn't constant. It fluctuates depending on how much is happening on screen and how complex the graphics are: the more demanding a scene, the lower the frame rate. Our rule of thumb is that you want an average of 60 fps for a smooth gaming experience, so that even the most complex scenes still play out at 30 fps.
The human brain perceives individual images displayed in a rapid sequence as a moving image... provided there are enough frames per second.
The minimum fps is actually more important than the average frame rate, and we frequently get requests to publish that number alongside the average. We have this data, but we don't publish it, for a couple of reasons.
The first reason is that the minimum value can be determined by an external factor that has little to do with the graphics card. If the system lags for a moment because Windows is doing something in the background, that minimum fps value will be distorted, and not representative of how the game runs on that particular graphics card. We frequently see data like an average of 150 fps with a minimum of 16 fps. Publishing that minimum would only be misleading.
The more important reason is that the minimum fps value alone doesn't tell you whether a game will run smoothly on that graphics card. To find that out, we need to dissect the benchmark data even further, down to the individual second.
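The point above can be sketched with a hypothetical benchmark trace: a game running at a steady ~100 fps with a single 50 ms hiccup (say, a background task). The lone slow frame drags the "minimum fps" down to 20, yet when the same data is grouped second by second, every second still delivers well over 90 frames.

```python
# Hypothetical trace: ~100 fps throughout, with one 50 ms hiccup
# (e.g. a background task), recorded as milliseconds per frame.
frametimes_ms = [10.0] * 296
frametimes_ms[150] = 50.0  # the single slow frame

# The instantaneous minimum fps is dominated by that one frame.
min_fps = 1000.0 / max(frametimes_ms)

# Group frames into one-second buckets by the time each frame starts,
# then count how many frames landed in each second.
buckets = {}
start_ms = 0.0
for ft in frametimes_ms:
    second = int(start_ms / 1000.0)
    buckets[second] = buckets.get(second, 0) + 1
    start_ms += ft

print(f"Instantaneous minimum fps: {min_fps:.0f}")       # 20
print(f"Frames rendered per second: {buckets}")          # {0: 100, 1: 96, 2: 100}
```

Despite a "minimum fps" of 20, no full second ever dips anywhere near 30 fps, which is why per-second (and ultimately per-frame) analysis says far more about smoothness than a single minimum value.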