Why would anyone want or need more than 40 FPS?
This might shed some light on why -
http://joz3d.net/html/fps.html
Movies
I have seen film students write in to columns claiming that anything over 24 fps is wasted. Why 24 fps? Movies in theaters run at 24 fps. They seem pretty smooth to me, so why would we need more? Well, let's take a look at movies from the eye's perspective.
First off, you are sitting in a dark movie theater and the projector is flashing a really bright light onto a highly reflective screen. What does this do? Have you ever had a doctor shine a bright light in your eye to look at your retina? Most of us have. What happens? A thing called "afterimage". When the doctor turns off the bright light, you see an afterimage of the light (and it is not very comfortable). Movie theaters do the same thing. The light reflected off the screen is much brighter than the theater surroundings, so you get an afterimage of the screen after each frame has passed, which makes the change to the next frame less noticeable.
Screen refresh is also a very important factor in this equation. Unlike a television or a computer monitor, the movie theater screen is refreshed all at once: the entire frame is projected in an instant rather than drawn line by line as on a TV or monitor. That full-frame flash feeds right back into the afterimage effect, thanks to the large neurotransmitter release in the retina.
Perhaps the most important factor in the theater is the artifact known as "motion blur". Motion blur is the main reason why movies can be shown at 24 fps, which saves Hollywood money by not making the film any longer than necessary (shooting a full feature at 30 fps would use about 25% more film than shooting it at 24 fps, and that turns out to be a lot of money). What motion blur does is give the impression of intervening frames between the two actual frames. If you pause a movie during a high-action scene with lots of movement, the frame you see will have a lot of blur, and any person or object in it will be almost unrecognizable, with highly blurry detail. When it is played back at the full 24 fps, things look good and sharp again. The human eye is used to motion blur (more on that phenomenon later), so the movie looks fine.
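To put a number on that, here is a quick back-of-the-envelope calculation in Python. The two-hour runtime and the 19 mm of 35 mm film advanced per frame are assumed round figures, not studio numbers; the percentage at the end depends only on the two frame rates.

RUNTIME_MIN = 120          # a typical two-hour feature (assumed)
MM_PER_FRAME = 19.0        # approximate 35 mm film advance per frame (4 perforations)

def film_length_m(fps, runtime_min=RUNTIME_MIN):
    # Total frames shot times the film advanced per frame, in metres.
    frames = fps * runtime_min * 60
    return frames * MM_PER_FRAME / 1000.0

len_24 = film_length_m(24)
len_30 = film_length_m(30)
print(f"24 fps: {len_24:,.0f} m of film")
print(f"30 fps: {len_30:,.0f} m of film")
print(f"extra footage at 30 fps: {(len_30 / len_24 - 1) * 100:.0f}%")   # -> 25%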
TV, Video Tape, and DVD
TVs refresh at 60 Hz. This is not bad for viewing, given the distance we usually sit from the TV and the size and spacing of the phosphors on an average set (a dot pitch of roughly 0.39 mm on a high-end model to 0.5 mm and higher on cheaper ones). That is actually quite big and fuzzy for most of us, but as long as we are not running any kind of productivity software (such as word processing) and are just watching movies from at least 6 feet away, it is just fine.
Now TV transmissions, video tape, and DVD play at 30 fps. The increase over movies is due mostly to the environment a TV is watched in. It is usually quite a bit brighter than a movie theater, and most importantly a TV does not do a full-screen refresh; rather, each frame is drawn line by line by an electron gun hitting the phosphors in the screen. So basically each frame is drawn twice by the TV (60 refreshes per second, 30 frames per second). Because the frame rate is exactly half the refresh rate, every frame is held for the same whole number of refreshes, so transitions between frames go a lot smoother than if you had, say, a 72 Hz refresh and a movie playing at 30 fps; in that case the frames cannot each be shown for the same number of refreshes, and the uneven cadence shows up as judder. Needless to say, this even division makes video and DVD look very smooth.
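To see why that even division matters, here is a rough sketch (the function and frame counts are purely illustrative) that counts how many display refreshes each 30 fps frame occupies at 60 Hz versus 72 Hz:

def refreshes_per_frame(fps, refresh_hz, n_frames=10):
    # For each frame, count the display refreshes that land in its time slot.
    # Integer arithmetic keeps the tick counting exact.
    counts = []
    for n in range(n_frames):
        ticks_before = n * refresh_hz // fps
        ticks_by_end = (n + 1) * refresh_hz // fps
        counts.append(ticks_by_end - ticks_before)
    return counts

print("30 fps on a 60 Hz display:", refreshes_per_frame(30, 60))  # [2, 2, 2, 2, ...]
print("30 fps on a 72 Hz display:", refreshes_per_frame(30, 72))  # [2, 2, 3, 2, 3, ...]

At 60 Hz every frame sits on screen for exactly two refreshes; at 72 Hz the frames fall into an uneven 2-2-3 pattern, and that cadence is the judder.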
Motion blur is again a very important part of making video look seamless. With motion blur, those two refreshes per frame give the impression of two frames to our eyes, which makes a really well-encoded DVD look absolutely incredible. Another factor to consider is that neither movies nor video dip in frame rate during complex scenes. With no frame rate drops, the action is again seamless.
Games on the Computer
This is the second toughest part of this article. TV and movies are easy to understand, and so is the technology behind them. Computers, and the way games are presented to us, are a lot more complex (the most complex part being the actual physiology and neuro-ethology of the visual system).
First off, the hardware used for visualization (namely the monitor) is a very fine piece of equipment. It has a very small dot pitch (the distance between phosphors), and the phosphors themselves are very fine, so we can get exquisite detail. We set the refresh rate at 72 Hz or higher for comfort (flicker free). This makes a very nice canvas to display information on; unfortunately, because it is so fine, it can also greatly magnify flaws in the output of a video card. We will get into refresh in the section on the human eye.
Let us start with how a scene or frame is set up by the computer. Each frame is put together in the frame buffer of the video card and is then sent out through the RAMDAC to the monitor. That part is very simple, nothing complex there (except the actual setup of the frame). Each frame is perfectly rendered and sent to the monitor. It looks good on the screen, but something is missing when the action gets fast: so far, programmers have been unable to put motion blur into these scenes. When a game runs at 30 fps, you are getting 30 perfectly rendered images per second. This does not fool the eye one bit. There is no motion blur, so the transition from frame to frame is not as smooth as in movies. 3dfx put out a demo that runs half the screen at 30 fps and the other half at 60 fps, and there is a definite difference between the two halves, with the 60 fps side looking much better and smoother than the 30 fps side.
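As a much-simplified sketch of that pipeline (render_scene and present are stand-in names, not any real graphics API), the loop looks roughly like this:

import time

def render_scene(frame_number):
    # Stand-in for the video card composing a complete frame in the frame buffer.
    return "frame %d" % frame_number

def present(buffer):
    # Stand-in for scanning the finished frame out to the monitor
    # (through the RAMDAC on the CRT-era hardware this article describes).
    print("displaying", buffer)

for frame in range(5):
    back_buffer = render_scene(frame)   # each frame is built in its entirety, perfectly sharp
    present(back_buffer)                # then shown; nothing blurs one frame into the next
    time.sleep(1 / 60)                  # pretend we hold a steady 60 fps

The only point of the sketch is that each displayed image is an isolated, perfectly crisp snapshot; nothing in the loop carries motion from one frame into the next.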
The lack of motion blur with current rendering techniques is a huge setback for smooth playback. Yet even if you could put motion blur into games, it would not be a good idea. We live in an analog world, and so we receive information continuously; we do not perceive the world through frames. In games, motion blur would cause the game to behave erratically. Take a game like Quake II: if motion blur were used, there would be problems calculating the exact position of an object, so it would be really tough to hit anything with your weapon. With motion blur, the object in question would not really exist in any of the places where the "blur" is drawn. Instead we have perfectly drawn frames, so objects can always be calculated at definite positions in space. So how do you simulate motion blur in a video game? Easy: have the game run at over 60 fps! Why? Read the section on the human eye.
Variations in frame rate also contribute to games looking jerky. In any game there is an average frame rate; it can be as high as the refresh rate of your monitor (70+), or it can drop into the 20s and 30s. This can really affect the visual quality of the game, and in fast-moving ones it can actually hurt your performance as a player. One of the great ideas to come out of the now-defunct Talisman project at Microsoft was the ability to lock frame rates (so the rate goes neither above nor below a certain value). In the next series of graphics cards, we may see this go into effect.
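Capping the upper side of such a lock is simple enough to sketch; the names below are illustrative rather than any particular engine's API. Keeping the rate from dropping below the target is the harder half, since that means rendering less detail rather than just waiting.

import time

def render_scene():
    time.sleep(0.005)          # pretend rendering this frame takes 5 ms

def run_capped(target_fps=60, n_frames=120):
    frame_budget = 1.0 / target_fps
    for _ in range(n_frames):
        start = time.perf_counter()
        render_scene()
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)   # idle out the rest of this frame's time slot

run_capped()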
The Human Eye (and Visual Cortex)
Contrary to the belief that we cannot distinguish anything over 30 fps, we can actually see and recognize rates of 70+ fps. How can you test this? You can do it quickly with your monitor at home. Set the refresh rate to 60 Hz and stare at it for a while; you can actually see the refreshes, and it is very tiring to your eyes. If we couldn't see more than 30 fps, why is flicker-free considered to be 72 Hz (refreshes per second)? You can really tell when the refresh is below 72 by turning your head and looking at the screen with your peripheral vision. You can definitely see the screen refreshes then (the rods in your peripheral vision are much more sensitive to this kind of flicker).