GeForce 6 series


The GeForce 6 series is the sixth generation of Nvidia's GeForce line of graphics processing units. Launched on April 14, 2004, the GeForce 6 family introduced PureVideo post-processing for video, SLI technology, and Shader Model 3.0 support.

GeForce 6 series features

SLI

The Scalable Link Interface (SLI) allows two GeForce 6 cards of the same type to be connected in tandem, with the driver software balancing the workload between the cards. SLI capability is limited to select members of the GeForce 6 family (the 6500 and above) and is only available for cards using the PCI Express bus.
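In practice the driver balances load in one of two modes: split-frame rendering (SFR), where each card draws part of every frame, and alternate-frame rendering (AFR), where the cards take turns drawing whole frames. The sketch below illustrates the two dispatch strategies conceptually; the Gpu class and its methods are hypothetical placeholders, not Nvidia's driver interface.

```python
# Conceptual sketch of SLI load balancing. The Gpu class and its render
# methods are hypothetical placeholders, not Nvidia's actual driver API.

class Gpu:
    def __init__(self, name):
        self.name = name

    def render_rows(self, frame, row_start, row_end):
        print(f"{self.name}: frame {frame}, rows {row_start}-{row_end}")

    def render_frame(self, frame):
        print(f"{self.name}: whole frame {frame}")

gpus = [Gpu("GPU0"), Gpu("GPU1")]

def split_frame(frame, height=1536, split=0.5):
    # SFR: divide each frame between the cards; a real driver moves the
    # split line dynamically based on the measured load of each card.
    cut = int(height * split)
    gpus[0].render_rows(frame, 0, cut)
    gpus[1].render_rows(frame, cut, height)

def alternate_frame(frame):
    # AFR: each card renders every other frame in its entirety.
    gpus[frame % len(gpus)].render_frame(frame)

for f in range(4):
    alternate_frame(f)
```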

Nvidia PureVideo Technology

Nvidia PureVideo technology is the combination of a dedicated video processing core and software which decodes H.264, VC-1, WMV, and MPEG-2 videos with reduced CPU utilization.
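On Windows, this decode hardware is reached through OS video APIs such as DXVA, which media frameworks probe for at runtime. As a rough illustration, FFmpeg can report the hardware-acceleration methods its build supports; this shows which decode paths the software side offers, though it says nothing about what a particular GPU actually accelerates.

```python
import subprocess

# List the hardware video-acceleration methods this FFmpeg build was
# compiled with (e.g. dxva2 on Windows). Requires ffmpeg on the PATH.
# This reports software support only, not what the installed GPU can do.
result = subprocess.run(
    ["ffmpeg", "-hide_banner", "-hwaccels"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```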

Shader Model 3.0

Nvidia was the first to deliver Shader Model 3.0 capability in its GPUs. SM3 extends SM2 in several ways: the main additions are standard FP32 precision, dynamic branching, increased efficiency, and support for longer shaders. Shader Model 3.0 was quickly adopted by game developers because it was quite simple to convert existing shaders coded with SM 2.0/2.0A/2.0B to version 3.0, and it offered noticeable performance improvements across the entire GeForce 6 line.
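Dynamic branching is the addition with the clearest performance implication: earlier hardware generally had to evaluate both sides of a conditional for every pixel and then select one result, whereas an SM3 shader can skip the untaken side entirely. The NumPy sketch below mimics that difference on the CPU; expensive_lighting is a made-up stand-in for a costly shader path, and the cost model is illustrative only.

```python
import numpy as np

def expensive_lighting(values):
    # Stand-in for an expensive per-pixel lighting computation.
    return np.sqrt(values) * 0.8 + 0.1

pixels = np.random.rand(1_000_000)
in_shadow = pixels < 0.3          # mask: these pixels take the cheap path

# SM2-style predication: compute BOTH paths for every pixel, then select.
lit = expensive_lighting(pixels)
sm2_result = np.where(in_shadow, 0.0, lit)

# SM3-style dynamic branch: run the expensive path only where needed.
sm3_result = np.zeros_like(pixels)
sm3_result[~in_shadow] = expensive_lighting(pixels[~in_shadow])

assert np.allclose(sm2_result, sm3_result)
print(f"expensive path ran on {np.count_nonzero(~in_shadow):,} "
      f"of {pixels.size:,} pixels")
```

(On real hardware the win depends on how coherent the branch is across neighboring pixels, since the GPU executes pixels in groups.)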

Caveats

PureVideo functionality varies by model, with some models lacking WMV9 and/or H.264 acceleration.
In addition, motherboards with some VIA and SiS chipsets and an AMD Athlon XP processor seemingly have compatibility problems with the GeForce 6600 and 6800 GPUs. Known problems include freezing, artifacts, reboots, and other issues that make gaming and use of 3D applications almost impossible. These problems seem to occur only in Direct3D-based applications and do not affect OpenGL.

GeForce 6 series comparison

The table below shows how the released members of the GeForce 6 series compare with Nvidia's previous flagship GPU, the GeForce FX 5950 Ultra, and with the comparable units of ATI's then-newly released Radeon X800 and X850 series:
|  | GeForce FX 5950 Ultra | GeForce 6200 TC-32 | GeForce 6600 GT | GeForce 6800 Ultra | ATI Radeon X800 XT PE | ATI Radeon X850 XT PE |
| --- | --- | --- | --- | --- | --- | --- |
| Transistor count | 135 million | 77 million | 146 million | 222 million | 160 million | 160 million |
| Manufacturing process | 0.13 μm | 0.11 μm | 0.11 μm | 0.13 μm | 0.13 μm low-k | 0.13 μm low-k |
| Die area (mm²) | ~200 | 110 | 156 | 288 | 288 | 297 |
| Core clock speed (MHz) | 475 | 350 | 500 | 400 | 520 | 540 |
| Number of pixel shader processors | 4 | 4 | 8 | 16 | 16 | 16 |
| Number of pixel pipes | 4 | 4 | 8 | 16 | 16 | 16 |
| Number of texturing units | 8 | 4 | 8 | 16 | 16 | 16 |
| Number of vertex pipelines | 3* | 3 | 3 | 6 | 6 | 6 |
| Peak pixel fill rate | 1.9 Gigapixel/s | 700 Megapixel/s | 2.0 Gigapixel/s | 6.4 Gigapixel/s | 8.32 Gigapixel/s | 8.64 Gigapixel/s |
| Peak texture fill rate | 3.8 Gigatexel/s | 1.4 Gigatexel/s | 4.0 Gigatexel/s | 6.4 Gigatexel/s | 8.32 Gigatexel/s | 8.64 Gigatexel/s |
| Memory interface | 256-bit | 64-bit | 128-bit | 256-bit | 256-bit | 256-bit |
| Memory clock speed | 950 MHz DDR | 700 MHz DDR2 | 1.0 GHz / 950 MHz** GDDR3 | 1.1 GHz GDDR3 | 1.12 GHz GDDR3 | 1.18 GHz GDDR3 |
| Peak memory bandwidth (GB/s) | 30.4 | 5.6 | 16.0 / 14.4** | 35.2 | 35.84 | 37.76 |

* The GeForce FX series has an array-based vertex shader.
** AGP 6600 GT variant.
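The fill-rate and bandwidth figures follow directly from the clock and width columns: peak texture fill rate is core clock × texturing units, and peak memory bandwidth is effective memory clock × bus width in bytes. (Peak pixel fill rate additionally depends on the ROP count, which the table does not list; the 6600 GT, for example, pairs 8 pixel pipelines with 4 ROPs, hence 2.0 Gigapixel/s.) A minimal sketch re-deriving the GeForce 6800 Ultra column:

```python
# Recompute the GeForce 6800 Ultra column of the table above.
core_clock_hz = 400e6      # 400 MHz core clock
texture_units = 16
mem_clock_hz = 1.1e9       # 1.1 GHz effective GDDR3 data rate
bus_width_bits = 256

texture_fill = core_clock_hz * texture_units    # texels per second
bandwidth = mem_clock_hz * bus_width_bits / 8   # bytes per second

print(f"texture fill rate: {texture_fill / 1e9:.1f} Gigatexel/s")  # 6.4
print(f"memory bandwidth:  {bandwidth / 1e9:.1f} GB/s")            # 35.2
```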

GeForce 6800 series

The first family in the GeForce 6 product line, the 6800 series catered to the high-performance gaming market. The very first GeForce 6 model, the 16-pixel-pipeline GeForce 6800 Ultra, was 2 to 2.5 times faster than Nvidia's previous top-line product, packing four times the pixel pipelines and twice the texture units, and adding a much improved pixel-shader architecture. Yet the 6800 Ultra was fabricated on the same 130 nanometer process node as the FX 5950, and it consumed slightly less power.
Like all of Nvidia's GPUs up until 2004, initial 6800 members were designed for the AGP bus. Nvidia added support for the PCI Express bus in later GeForce 6 products, usually by means of an AGP-PCIe bridge chip. In the case of the 6800 GT and 6800 Ultra, Nvidia developed a variant of the NV40 chip called the NV45. The NV45 shares the same die core as the NV40 but embeds an AGP-PCIe bridge on the chip's package. The NV48 is a version of the NV45 that supports 512 MiB of RAM.
The use of an AGP-PCIe bridge chip initially led to fears that natively AGP GPUs would not be able to take advantage of the additional bandwidth offered by PCIe and would therefore be at a disadvantage relative to native PCIe chips. However, benchmarking revealed that even AGP 4× was fast enough that most contemporary games did not improve significantly in performance when switched to AGP 8×, rendering the further bandwidth increase provided by PCIe largely superfluous. Additionally, Nvidia's on-board implementations of AGP were clocked at AGP 12× or 16×, providing bandwidth comparable to PCIe for the rare situations when this bandwidth was actually necessary.
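The raw numbers bear this out. AGP uses a 32-bit bus at a 66 MHz base clock, so 1× works out to roughly 266 MB/s and each "×" multiplier scales that linearly, while first-generation PCIe x16 delivers about 4 GB/s in each direction. A back-of-the-envelope comparison using the usual theoretical peak figures:

```python
# Theoretical peak bus bandwidth in GB/s (decimal units).
AGP_1X = 66.67e6 * 4   # 66 MHz clock x 32-bit (4-byte) bus ~= 266 MB/s

for mult in (4, 8, 12, 16):
    print(f"AGP {mult:2d}x : {AGP_1X * mult / 1e9:.2f} GB/s")

# PCIe 1.x carries 250 MB/s per lane per direction after 8b/10b encoding.
pcie_x16 = 250e6 * 16
print(f"PCIe x16: {pcie_x16 / 1e9:.2f} GB/s per direction")
```

AGP 8× comes to about 2.1 GB/s, so the on-package 16× link (~4.3 GB/s) roughly matches a PCIe x16 slot.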
The use of a bridge chip allowed Nvidia to release a full complement of PCIe graphics cards without having to redesign them for the PCIe interface. Later, when Nvidia's GPUs were designed to use PCIe natively, the bidirectional bridge chip allowed them to be used in AGP cards.
ATI, initially a critic of the bridge chip, eventually designed a similar solution for their own cards.
Nvidia's professional Quadro line contains members drawn from the 6800 series: Quadro FX 4000 and the Quadro FX 3400, 4400 and 4400g. The 6800 series was also incorporated into laptops with the GeForce Go 6800 and Go 6800 Ultra GPUs.

PureVideo and the AGP GeForce 6800

PureVideo expanded the level of multimedia-video support from decoding of MPEG-2 video to decoding of more advanced codecs, with enhanced post-processing and limited acceleration for encoding. Ironically, the first GeForce products to offer PureVideo, the AGP GeForce 6800/GT/Ultra, failed to support all of PureVideo's advertised features.
Media player software with support for WMV acceleration did not become available until several months after the 6800's introduction. User and web reports showed little if any difference between PureVideo-enabled GeForces and non-PureVideo cards. Nvidia's prolonged public silence after promising updated drivers, together with test benchmarks gathered by users, led the user community to conclude that the WMV9 decoder component of the AGP 6800's PureVideo unit is either non-functional or intentionally disabled.
In late 2005, an update to Nvidia's website finally confirmed what had long been suspected by the user community: WMV-acceleration is not available on the AGP 6800.
Today's computers are fast enough to decode WMV9 video and other sophisticated codecs such as MPEG-4, H.264, or VP8 without hardware acceleration, largely negating the need for PureVideo.

GeForce 6 series general features

  • 4, 8, 12, or 16 pixel-pipeline GPU architecture
  • Up to 8x more shading performance compared to the previous generation
  • CineFX 3.0 engine - DirectX 9 Shader Model 3.0 support
  • On-chip video processor
  • Full MPEG-2 encoding and decoding at GPU level
  • Advanced Adaptive De-Interlacing
  • DDR and GDDR3 memory on a 256-bit-wide memory interface
  • UltraShadow II technology - 3x to 4x faster than NV35
  • High Precision Dynamic Range technology
  • 128-bit studio precision through the entire pipeline - Floating-point 32-bit color precision
  • IntelliSample 4.0 Technology - 16x Anisotropic Filtering, Rotating Grid Antialiasing and Transparency Antialiasing
  • Maximum display resolution of 2048×1536 @ 85 Hz
  • Video Scaling and Filtering - HQ filtering techniques up to HDTV resolutions
  • Integrated TV encoder - TV output at up to 1024×768 resolution
  • OpenGL 2.0 Optimizations and support
  • DVC 3.0
  • Dual 400 MHz RAMDACs which support QXGA displays up to 2048×1536 @ 85 Hz (see the estimate below)
  • Dual DVI outputs on select members
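The 400 MHz RAMDAC rating and the 2048×1536 @ 85 Hz ceiling are two sides of the same constraint: the DAC must produce one analog sample per pixel, including the blanking intervals between lines and frames. A rough estimate, assuming a typical blanking overhead of about 25% (the exact figure depends on the timing standard in use):

```python
# Estimate the pixel clock needed to drive QXGA (2048x1536) at 85 Hz.
# The ~25% blanking overhead is an assumption; real GTF/CVT timings vary.
width, height, refresh = 2048, 1536, 85
visible = width * height * refresh     # visible pixels per second
pixel_clock = visible * 1.25           # add horizontal/vertical blanking

print(f"visible rate: {visible / 1e6:.0f} MHz")      # ~267 MHz
print(f"pixel clock:  {pixel_clock / 1e6:.0f} MHz")  # ~334 MHz, < 400 MHz
```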

GeForce 6600 series

The GeForce 6600 was officially launched on August 12, 2004, several months after the launch of the 6800 Ultra. With half the pixel pipelines and vertex shaders of the 6800 GT, and a smaller 128-bit memory bus, the lower-performance and lower-cost 6600 is the mainstream product of the GeForce 6 series. The 6600 series retains the core rendering features of the 6800 series, including SLI. Equipped with fewer rendering units, the 6600 series processes pixel data at a slower rate than the more powerful 6800 series. However, the reduction in hardware resources, and migration to TSMC's 110 nm manufacturing process, make the 6600 both less expensive for Nvidia to manufacture and less expensive for customers to purchase.
The 6600 series has three variants: the GeForce 6600 LE, the 6600, and the 6600 GT. The 6600 GT performs considerably better than the GeForce FX 5950 Ultra or Radeon 9800 XT, scoring around 8000 in 3DMark03 where the GeForce FX 5950 Ultra scored around 6000, while also being much cheaper. Notably, with drivers prior to December 2004, the 6600 GT offered performance identical to ATI's high-end X800 PRO when running the popular game Doom 3. It was also about as fast as the higher-end GeForce 6800 when running games without anti-aliasing in most scenarios.
At introduction, the 6600 family was only available in PCI Express form; AGP models became available roughly a month later, through the use of Nvidia's AGP-PCIe bridge chip. A majority of the AGP GeForce 6600 GTs have their memory clocked at 900 MHz, 100 MHz slower than the 1000 MHz of the PCI Express cards; on the 128-bit bus this cuts peak memory bandwidth from 16.0 GB/s to 14.4 GB/s and can contribute to a performance decline in certain games. However, it was often possible to "overclock" the memory to its nominal frequency of 1000 MHz, and there are AGP cards that use 1000 MHz by default.