Keeping up with today's GPU market has become a confusing process. On one hand, the GeForce 8800 GTS now comes in two flavours, with either 640MB or 320MB of GDDR3 memory. In addition, the clock frequencies of both the memory and the GPU differ between manufacturers, gaining or losing the odd 50MHz.
The Palit 640MB GeForce 8800 GTS is one of the slower GTS cards, with a GPU speed of 500MHz and a memory clock of 1600MHz. It is also one of the best-value 8800s on the market, priced only a little higher than many 320MB variants.
This difference in memory capacity is unlikely to matter in the majority of today's DirectX 9 titles, at least when run at resolutions below 1600 x 1200. For tomorrow's games like Crysis, featuring extremely large, detailed textures, having the full 640MB will probably make a big difference.
Since that specific title is a showcase for DirectX 10, the main reason for the 8800 series' existence, it's a major incentive to own a next-generation GPU and will likely become a serious benchmark for 3D performance under Vista.
The GTS chipset loses a few features compared with the pricier GTX. The number of unified shaders drops from 128 to 96, and the memory bus is shrunk to 320-bit. Performance takes a hit, but the raw power of the 8800 still delivers great benchmark results: this card blew the liquid-cooled Radeon X1950XTX out of the water.
The more modest power requirements of the GTS chipset mean only a single PCI-e power connector is needed. The card is also shorter, so it can be squeezed into a case without needing metres of headroom. The heatsink and fan are no larger than those found on the first generation of GeForce 6800s.
While this card doesn't deliver quite the same revolutionary performance as the 8800 GTX, it's excellent value for money. The benchmarks came out around 15 per cent lower than those of the overclocked GTX we reviewed above, a fair trade-off considering the price tag is almost £200 less.
With few compromises on specification, the Palit GeForce 8800 GTS gives you the peace of mind that performance will remain as impressive as you would expect from Nvidia's new chipset.