I have no idea how to measure the FPS of an arbitrary application. Even the measuring mechanism will depend on which graphics library the application uses (OpenGL vs DirectX vs software rendering). The only application I know of that does this kind of thing is Fraps (http://www.fraps.com/), so it is possible.
Refresh rate is not the same as FPS. Refresh rate is the speed at which the monitor can update the display. For CRT monitors this is linked to the vertical frequency, and complicated by interlaced/non-interlaced matters (although I think the only CRTs still around that use interlacing are TVs). LCD/TFT is a completely different matter; manufacturers also quote a refresh rate for them, but I think that is more based on the time it takes to change the color of a pixel, as an LCD doesn't have an electron beam to move around at high speed.
The FPS of an application, however, is how many frames the application can calculate per second. This can be lower than the refresh rate (if FPS is very low, you get a slideshow effect), or it can be faster than the refresh rate (possibly resulting in tearing, if the image the monitor is displaying is updated halfway through a refresh). To make it more fun, many applications that do 3D rendering have a 'lock vsync' option that limits the FPS to the refresh rate.
So the two values might be equal, but refresh rate is hardware-limited and can't be changed without changing hardware; it can easily be looked up in the monitor's specification. FPS is a software matter that depends on how fast the PC can render: faster software means more FPS. For an arbitrary application this can probably only be measured at the graphics driver level, but I have no idea how.