I'm not joining this thread for bragging rights, just to clarify the differences between consumer-grade hardware and commercial-grade workstations.
The Quadro cards are specialized hardware with multiple threading capabilities and built-in special functions that go far beyond what the consumer-grade cards can do. In the old days, the CAD cards had built-in primitives to help speed up the rendering of objects. The CAD software would send specific vectoring information, and the card would interpret it and automatically draw that shape. This helped speed up the rendering of the image. I know because I used to troubleshoot graphics terminals that used the NEC720D graphics processor. This chip had the primitive set built in for circles, ellipses, squares and shading. We're talking about something that was used about 25 years ago, but it's relevant to this conversation.
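To give a rough idea of why those built-in primitives mattered: without them, the host had to compute and ship every point of a shape, while with them it sent one short command and the chip drew the shape itself. Here's a little sketch of that contrast; the command opcode and byte layout are made up for illustration, not the actual NEC format.

```python
import math

# Without hardware primitives: the host computes every point of the
# circle itself and ships them all to the display.
def circle_points(cx, cy, r, n=360):
    return [(cx + r * math.cos(2 * math.pi * i / n),
             cy + r * math.sin(2 * math.pi * i / n))
            for i in range(n)]

# With a primitive-capable card: the host sends one short command
# (hypothetical format: opcode, center x, center y, radius) and the
# graphics chip generates the circle on its own.
def draw_circle_command(cx, cy, r):
    return bytes([0x20]) + cx.to_bytes(2, "big") \
                         + cy.to_bytes(2, "big") + r.to_bytes(2, "big")
```

So instead of pushing hundreds of coordinates over the bus, the software pushed a handful of bytes, which is where the speedup came from.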
Much later this work was done in software because the systems ran faster, and today the cards are used for solid modeling instead of line rendering. Even if a Quadro is based on an older chipset, such as the GF20x series, its superfast pipelines make it far different from a consumer card. There are many additional pipelines for rendering and shading in real time, at rates far faster than the consumer cards can handle.
Can these cards handle games? Sure they can, or I should say it depends. The Quadro 3700, for example, plays games fine. In fact the playback on this card is not much different than on the GF470 I have in my system. They both handle the complete OpenGL command set and can handle the other functions that are available.
If the card has been specialized to handle video processing, then the pipelines and registers may be set up differently, so rendering 3D images will still happen, but at a slower rate than on a general 3D graphics card.
The Xeon processors are a bit different from the i7, and again have different branching and pipelining. They have multiple instruction queues and very high math precision not found in lower-cost consumer-grade CPUs. A good example is my brother's workstation, which he uses for 3D modeling. On one project, he borrowed my system for the modeling while he did some rendering on his workstation. I had an 8800GTS and an IDuo2 system at the time. There were missing faces and odd points that could not be adjusted on my system. When he rendered the image on the Xeon-based system, the problem disappeared. He confirmed with the software developer that this was a floating-point precision issue with the processor. The workstation at the time had a single Xeon and an ancient Quadro 980, which is similar to the 8800-series NVidia consumer-grade video card.
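The kind of error he saw is exactly what floating-point precision problems look like. Here's a minimal sketch (using NumPy on any CPU, not his modeling software) of how a vertex offset by a fine detail can collapse into the same point as its neighbor in 32-bit math, while 64-bit math keeps them apart; merged vertices are the sort of thing that shows up as missing faces or points that won't move.

```python
import numpy as np

# A model sitting 100,000 units from the origin, with 0.001-unit detail.
base = 100000.0
detail = 0.001

# 32-bit floats are spaced about 0.008 units apart near 100,000, so the
# fine offset is rounded away and the two vertices merge into one.
v32_a = np.float32(base)
v32_b = np.float32(base) + np.float32(detail)
print(v32_a == v32_b)   # True: the detail vanished

# 64-bit floats have a spacing of roughly 1e-11 there, so it survives.
v64_a = np.float64(base)
v64_b = np.float64(base) + np.float64(detail)
print(v64_a == v64_b)   # False: the vertices stay distinct
```

Whether the fix comes from the CPU, the card, or the software simply using 64-bit math throughout, the underlying issue is the same: not enough bits to hold both the large coordinate and the small detail.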
The thing with consumer-grade hardware is that the system builders muscle the data through the system instead of letting the processor and video card do the work they really need to do to render the images. The high-end workstations meant for this type of work come with specialized hardware, so the precision is there for accurate work as well as performance at the same time.
So having said this, I'm interested in seeing the outcome of this experiment. Who knows? If the operation is super terrific, then perhaps we'll all have to go out and get the super high-end workstations to run our simulator at its ultimate best.
John