
13 Years of Nvidia Graphics Cards

001

Nvidia's history begins with the NV1 chip, sold by SGS-THOMSON Microelectronics as the STG-2000. That board combined a 2D card, a 3D accelerator, a sound card, and a port for Sega Saturn game controllers, all on the same PCI board. The best-known of these cards is the famous Diamond Edge 3D, released two years after Nvidia's founding. The principal problem with the NV1 lay in its handling of 3D: it used quadratic texture mapping (QTM) instead of the polygon-based technique used today. DirectX appeared just after the card was released, and since it relied on polygons, the NV1 was a failure over the long term. Worth mentioning: the card's memory could be expanded on certain models (from 2 MB to 4 MB), and many of the games optimized for it were ports from the Saturn, since the card used a similar architecture.

Nvidia NV1
Date released: September 1995
Card interface: PCI
Fillrate: 12 Mtexels/s
DirectX version: -
Memory type: EDO/VRAM
Maximum memory: 4 MB
Memory clock frequency: 75 MHz
Memory bus: 64 bits
Maximum bandwidth: 0.6 GB/s
Maximum resolution: 1600 x 1200 / 15 bits
Video out: 1 x VGA
RAMDAC: 170 MHz
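The bandwidth figures in these tables follow directly from bus width and memory clock: peak bandwidth is the bus width in bytes multiplied by the effective transfer rate. A minimal Python sketch (the helper name and the SDR/DDR multiplier argument are mine, purely for illustration):

```python
def peak_bandwidth_gbs(bus_bits, clock_mhz, transfers_per_clock=1):
    """Peak memory bandwidth in GB/s.

    bus_bits:            memory bus width in bits (64 for the NV1)
    clock_mhz:           memory clock in MHz
    transfers_per_clock: 1 for EDO/SDR memory, 2 for DDR
    """
    return (bus_bits / 8) * clock_mhz * transfers_per_clock / 1000  # MB/s -> GB/s

# NV1: 64-bit bus at 75 MHz -> 0.6 GB/s, matching the table above
print(peak_bandwidth_gbs(64, 75))
# GeForce 256 DDR (covered later): 128-bit bus at 150 MHz, double data rate -> 4.8 GB/s
print(peak_bandwidth_gbs(128, 150, 2))
```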

The NV2 used the same rendering method and was never completed. It was to have been used in the Dreamcast console (which replaced the Saturn), but Sega finally chose a polygon-based technology (PowerVR) and Nvidia abandoned QTM in favor of polygon-based rendering with the NV3.

002 Riva 128 And Direct3D

In 1997, Nvidia's move to polygon-based 3D yielded the NV3, better known as the Riva 128. A little-known fact: Riva stands for Real-time Interactive Video and Animation accelerator. Two versions of the chip existed, the Riva 128 and the Riva 128ZX. The difference was slight: the ZX had a faster RAMDAC, 8 MB of memory instead of 4 MB, and AGP 2x support. The Riva 128 enjoyed a certain level of success because its price was attractive, despite image quality that sometimes left a bit to be desired compared to the 3Dfx products of the period. The Nvidia card offered 2D and 3D on the same board, as well as support for Direct3D. Full OpenGL drivers arrived only with the 128ZX, although Quake-specific drivers existed earlier (not a complete ICD).

Nvidia NV3 (Riva 128/128ZX)


Riva 128
Date released: April 1997
Card interface: PCI/AGP 1x
Fillrate (texels): 100 Mtexels/s
Fillrate (pixels): 100 Mpixels/s
Rendering pipelines: 1
Texture units: 1
Chip clock frequency: 100 MHz
Fabrication process: 0.35 µm
Number of transistors: 3.5 million
DirectX version: 5
Memory type: SDRAM
Maximum memory: 4 MB
Memory clock frequency: 100 MHz
Memory bus: 128 bits
Maximum bandwidth: 1.6 GB/s
Video out: 1 x VGA
RAMDAC: 206 MHz

Riva 128ZX
Date released: March 1998
Card interface: PCI/AGP 2x
Fillrate (texels): 100 Mtexels/s
Fillrate (pixels): 100 Mpixels/s
Rendering pipelines: 1
Texture units: 1
Chip clock frequency: 100 MHz
Fabrication process: 0.35 µm
Number of transistors: 3.5 million
DirectX version: 5
Memory type: SDRAM
Maximum memory: 8 MB
Memory clock frequency: 100 MHz
Memory bus: 128 bits
Maximum bandwidth: 1.6 GB/s
Video out: 1 x VGA
RAMDAC: 250 MHz

The Riva 128 was popular with OEMs because of its price, which was below that of a Voodoo Graphics card while delivering nearly the same Direct3D performance. It was also one of the first AGP cards, even if the Riva 128 used the interface essentially as a faster PCI bus. Finally, and somewhat amusingly, one very well-known manufacturer competed with Nvidia on performance with a graphics card of its own: Intel, with its i740. Times have changed.

003 NV4: Twin Texels For The TNT

In 1998, 3Dfx had a high-performance 3D card in the Voodoo2, but it had major limitations: archaic memory management (textures stored separately), a 16-bit color ceiling, the need for a separate 2D graphics card, and a PCI-only interface in practice (AGP models did exist). Then the Riva TNT arrived on the scene: a fast 3D card with a lot of memory for the time and built-in 2D capabilities. Except for video playback (it had no MPEG2 acceleration, as ATI's cards did), the TNT was a success. It was the first Nvidia card capable of applying two textures in a single pass, hence the name TNT, for TwiN Texel.

Nvidia NV4 (Riva TNT)

Date released: 1998
Card interface: PCI/AGP 2x
Fillrate (texels): 180 Mtexels/s
Fillrate (pixels): 180 Mpixels/s
Rendering pipelines: 2
Texture units: 2
Chip clock frequency: 90 MHz
Fabrication process: 0.35 µm
Number of transistors: 7 million
DirectX version: 6
Memory type: SDRAM
Memory: 16 MB
Memory clock frequency: 110 MHz
Memory bus: 128 bits
Maximum bandwidth: 1.75 GB/s
Video out: 1 x VGA
RAMDAC: 250 MHz
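The fillrate figures quoted in these tables are similarly mechanical: pixel fillrate is the core clock multiplied by the number of rendering pipelines, and texel fillrate is the core clock multiplied by the number of texture units. A small illustrative Python sketch (the function name is mine):

```python
def fillrates(clock_mhz, pipelines, texture_units):
    """Return (pixel fillrate in Mpixels/s, texel fillrate in Mtexels/s)."""
    return clock_mhz * pipelines, clock_mhz * texture_units

# Riva TNT: 90 MHz, 2 pipelines, 2 texture units -> 180 / 180 (see the table above)
print(fillrates(90, 2, 2))
# GeForce 2 GTS (covered later): 200 MHz, 4 pipelines, 8 texture units -> 800 / 1600
print(fillrates(200, 4, 8))
```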

The TNT was a less powerful card than originally planned. Nvidia had wanted to bring out a card faster than the Voodoo2, built on a 250 nm process with a 110 MHz core clock (200 MHz for the memory). In fact, the TNT used a 350 nm process and ran at 90 MHz, like the 3Dfx card, with the memory at 110 MHz.

004 NV5: The First Ultra

In 1999, the TNT2 made its appearance. It was close to what the TNT was originally supposed to be, and can be thought of as a die shrink of the TNT from 350 to 250 nm. This was also the first time Nvidia used the name Ultra for one of its cards. The TNT2 line was segmented by clock frequency, though at the time Nvidia offered only two versions (far from today's bewildering assortment): the TNT2 and the TNT2 Ultra. The TNT2 was a powerful card for its day, easily a match for the Voodoo3 while offering more features, even though there was still no MPEG2 decoding. It was also Nvidia's first AGP 4x card, even though that standard wasn't really exploited by the TNT2.

Nvidia NV5 (Riva TNT2, TNT2 Ultra)


Riva TNT2
Date released: March 1999
Card interface: PCI/AGP 4x
Fillrate (texels): 250 Mtexels/s
Fillrate (pixels): 250 Mpixels/s
Rendering pipelines: 2
Texture units: 2
Chip clock frequency: 125 MHz
Fabrication process: 0.25 µm
Number of transistors: 15 million
DirectX version: 6
Memory type: SDRAM
Memory: 32 MB
Memory clock frequency: 150 MHz
Memory bus: 128 bits
Maximum bandwidth: 2.4 GB/s
Video out: 1 x VGA
RAMDAC: 300 MHz

Riva TNT2 Ultra
Date released: March 1999
Card interface: PCI/AGP 4x
Fillrate (texels): 300 Mtexels/s
Fillrate (pixels): 300 Mpixels/s
Rendering pipelines: 2
Texture units: 2
Chip clock frequency: 150 MHz
Fabrication process: 0.25 µm
Number of transistors: 15 million
DirectX version: 6
Memory type: SDRAM
Memory: 32 MB
Memory clock frequency: 183 MHz
Memory bus: 128 bits
Maximum bandwidth: 2.9 GB/s
Video out: 1 x VGA
RAMDAC: 300 MHz

The NV6, which also came out in 1999, was a cut-down version of the TNT2. It was sold under the names Vanta, Vanta LT and TNT2 M64. These cards were significantly slower than the TNT2 (and even the original TNT), essentially because of their lower clock frequencies and 64-bit memory bus. They were very successful with OEMs, however, who used the TNT2 name as bait.

005 GeForce: The First GPU

Late in 1999, Nvidia announced the GeForce 256. This was the first card Nvidia marketed as a GPU, but its major advance was really consumer hardware support for T&L (transform and lighting). This technology, already used in OpenGL and in professional 3D, performs the per-triangle calculations on the graphics card instead of on the CPU. The actual gain was considerable in certain cases, since the graphics card had roughly four times the geometry throughput of a high-end CPU of the time (15 million triangles per second for the GeForce, as opposed to four million for a 550 MHz Pentium III). The card also used a different architecture from the TNT2: instead of two rendering pipelines, each with a texture unit, there were four pipelines with one texture unit each, giving the GeForce more rendering power at a lower clock frequency. The GeForce 256 was also the first card to use DDR SDRAM, increasing memory bandwidth.
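To make concrete what hardware T&L takes over from the CPU, here is a purely illustrative Python sketch of the per-vertex work involved: a 4x4 transform followed by a simple diffuse lighting term. This is only a software illustration of the math, not Nvidia's implementation.

```python
# Illustrative only: the kind of per-vertex math that hardware T&L moved off the CPU.

def transform(matrix, vertex):
    """Multiply a 4x4 row-major matrix by a homogeneous (x, y, z, w) vertex."""
    return tuple(sum(matrix[row][i] * vertex[i] for i in range(4)) for row in range(4))

def diffuse(normal, light_dir):
    """Lambertian lighting term: N . L clamped to [0, 1]."""
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, min(1.0, dot))

identity = [[1, 0, 0, 0],
            [0, 1, 0, 0],
            [0, 0, 1, 0],
            [0, 0, 0, 1]]

print(transform(identity, (1.0, 2.0, 3.0, 1.0)))  # (1.0, 2.0, 3.0, 1.0)
print(diffuse((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # 1.0
```

The point of T&L was simply that loops like this no longer ran on the CPU; the 15 million triangles per second quoted above is this kind of work done in hardware.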

Nvidia NV10 (GeForce 256)


GeForce 256 (SDR)
Date released: October 1999
Card interface: PCI/AGP 4x
Fillrate (texels): 480 Mtexels/s
Fillrate (pixels): 480 Mpixels/s
Rendering pipelines: 4
Texture units: 4
Chip clock frequency: 120 MHz
Fabrication process: 0.22 µm
Number of transistors: 23 million
DirectX version: 7
Memory type: SDRAM
Maximum memory: 32 MB
Memory clock frequency: 166 MHz
Memory bus: 128 bits
Maximum bandwidth: 2.6 GB/s
Video out: 1 x VGA
RAMDAC: 350 MHz
Video playback: MPEG2 semi-hardware

GeForce 256 (DDR)
Date released: February 2000
Card interface: PCI/AGP 4x
Fillrate (texels): 480 Mtexels/s
Fillrate (pixels): 480 Mpixels/s
Rendering pipelines: 4
Texture units: 4
Chip clock frequency: 120 MHz
Fabrication process: 0.22 µm
Number of transistors: 23 million
DirectX version: 7
Memory type: DDR
Maximum memory: 32 MB
Memory clock frequency: 150 MHz (x2)
Memory bus: 128 bits
Maximum bandwidth: 4.8 GB/s
Video out: 1 x VGA
RAMDAC: 350 MHz
Video playback: MPEG2 semi-hardware

Nvidia moved directly from NV6 to NV10 for the GeForce 256, and the numbering of subsequent models proceeded in steps of five, with variants for the low-end and high-end models. The GeForce 256 was also the first Nvidia card to handle MPEG2 acceleration, but only partially (motion compensation), and the first consumer card with a DVI connector (via an external chip).

006 NV15: Nvidia Improves The GeForce 256

In the year 2000, Nvidia had a fast graphics card in the GeForce 256 DDR, but ATI was starting to get more competitive with its Radeon, which was both faster and more efficient. Nvidia responded with a new card, the GeForce 2 GTS. Built on a 180 nm process, it was noticeably faster than the GeForce 256. It doubled the number of texture units from one to two per rendering pipeline, allowing eight textures to be applied in a single pass across its four pipelines. Nvidia released several versions of the card: the GTS (GigaTexel Shader, 200/166), the Pro (200/200) and the Ti (250/200).

Nvidia NV15 (GeForce 2 GTS)


Date released: April 2000
Card interface: PCI/AGP 4x
Fillrate (texels): 1600 Mtexels/s
Fillrate (pixels): 800 Mpixels/s
Rendering pipelines: 4
Texture units: 8
Chip clock frequency: 200 MHz
Fabrication process: 0.18 µm
Number of transistors: 25 million
DirectX version: 7
Memory type: DDR
Maximum memory: 64 MB
Memory clock frequency: 166 MHz (x2)
Memory bus: 128 bits
Maximum bandwidth: 5.3 GB/s
Video out: 1 x VGA
RAMDAC: 350 MHz
Video playback: MPEG2 semi-hardware

In August 2000, pending the release of the GeForce 3, Nvidia put out the NV16 (GeForce 2 Ultra). This was not a new chip but an NV15 with higher clock frequencies: 250 MHz for the GPU and 230 MHz for the memory, compared with 200 and 166 MHz on the original card. It was also one of the most expensive cards Nvidia has ever produced.

007 NV11: The First Low-End Version

The GeForce 2 GTS offered great performance, but also a high price tag, and Nvidia needed a card for gaming enthusiasts who couldn't afford to spend a small fortune on a computer. The company's answer was the NV11, the GeForce 2 MX, also released in 2000. Unlike the TNT2 M64 and Vanta, which were really nothing more than an NV5 with a 64-bit memory bus, the NV11 had a new architecture derived from the GeForce 2 GTS. Nvidia did away with half of the rendering pipelines, but in multitexturing a GeForce 2 MX still had more power than a GeForce 256. This was the first Nvidia card that could drive more than one display, a function that would remain part of Nvidia's midrange cards for a few years. The GeForce 2 MX used only SDR memory and was also the first GeForce to be released in a mobile version (the GeForce 2 Go).

Nvidia NV11 (GeForce 2 MX)


Date released: June 2000
Card interface: PCI/AGP 4x
Fillrate (texels): 700 Mtexels/s
Fillrate (pixels): 350 Mpixels/s
Rendering pipelines: 2
Texture units: 4
Chip clock frequency: 175 MHz
Fabrication process: 0.18 µm
Number of transistors: 19 million
DirectX version: 7
Memory type: SDRAM
Maximum memory: 64 MB
Memory clock frequency: 166 MHz
Memory bus: 128 bits
Maximum bandwidth: 2.6 GB/s
Video out: 2 x VGA/DVI
RAMDAC: 350 MHz
Video playback: MPEG2 semi-hardware

Nvidia brought out several versions of the GeForce 2 MX in addition to the standard model and the Go version. These included the MX400 (with a GPU clocked at 200 MHz), the MX200 (175 MHz GPU and a 64-bit memory bus at 166 MHz) and the very poor MX100, with a GPU clocked at only 143 MHz and a 32-bit memory bus (0.6 GB/s of bandwidth). Finally, some rare cards were equipped with 64-bit DDR, making them roughly equivalent to the 128-bit SDR versions.

008 Enter The GeForce 3

In 2001, the GeForce 3 made its appearance. This card, the first to be DirectX 8 compatible, supported programmable pixel shaders. With 57 million transistors, it ran at fairly conservative clock speeds, and at launch a GeForce 2 Ultra could outperform it in many cases. The card brought a few improvements in memory management, but its complex architecture prevented Nvidia from developing an entry-level version of it.
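To give a feel for what "programmable" meant: instead of choosing from a fixed menu of texture-combine modes, DirectX 8 let developers supply a short per-pixel program of their own. The Python function below is only a toy software analogue of one such program (modulating a base texture sample by a light-map sample); it is not Nvidia's hardware or Microsoft's shader language.

```python
# Toy analogue of a per-pixel program: modulate a base color by a light-map sample.
# Real DirectX 8 pixel shaders ran on the GPU in a short assembly-like language.

def toy_pixel_shader(base_rgb, lightmap_rgb):
    """Multiply two 8-bit RGB samples, as a classic light-mapping pass would."""
    return tuple((b * l) // 255 for b, l in zip(base_rgb, lightmap_rgb))

print(toy_pixel_shader((200, 180, 160), (128, 128, 128)))  # roughly half brightness
```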

Nvidia NV20 (GeForce 3)


Date released: March 2001
Card interface: PCI/AGP 4x
Fillrate (texels): 2000 Mtexels/s
Fillrate (pixels): 1000 Mpixels/s
Rendering pipelines: 4
Texture units: 8
Vertex Shader units: 1
Pixel Shader version: 1.1
Chip clock frequency: 250 MHz
Fabrication process: 0.15 µm
Number of transistors: 57 million
DirectX version: 8
Memory type: DDR
Maximum memory: 64 MB
Memory clock frequency: 230 MHz (x2)
Memory bus: 128 bits
Maximum bandwidth: 7.4 GB/s
Video out: 1 x VGA
RAMDAC: 350 MHz
Video playback: MPEG2 semi-hardware

Nvidia offered two derivative versions of the GeForce 3: the Ti 200, which was a little less expensive than the original, and the Ti 500, which was more expensive. The former was clocked at 175/200 MHz (GPU/memory) and the latter at 240/250 MHz.

009 The GeForce 4 That Was A GeForce 2

Moving ahead to 2002, Nvidia had a high-performance card in the GeForce 3, but it was too complex. Deriving a new card from its architecture (as had been done with the NV11) was a difficult proposition, so Nvidia instead used the GeForce 2 architecture to create the NV17, marketed as the GeForce 4 MX. The cards used the same layout as the GeForce 2 MX (two pipelines, each able to apply two textures) but ran at higher clock rates. They also used the memory management introduced with the GeForce 3, had hardware MPEG2 decoding, and supported multiple displays. Still, they were DirectX 7 cards, and so were outdated from the moment they launched, despite adequate performance in some cases. The line included three cards: the MX420, MX440 and MX460. The first was clocked at 250 MHz for the GPU and 166 MHz (SDR) for the memory; the second ran at 275/200 (DDR), and the third at 300/275 (DDR).

Nvidia NV17 (GeForce 4 MX 440)


Date released: February 2002
Card interface: PCI/AGP 4x
Fillrate (texels): 1100 Mtexels/s
Fillrate (pixels): 550 Mpixels/s
Rendering pipelines: 2
Texture units: 4
Chip clock frequency: 275 MHz
Fabrication process: 0.15 µm
Number of transistors: 27 million
DirectX version: 7
Memory type: DDR
Maximum memory: 128 MB
Memory clock frequency: 200 MHz (x2)
Memory bus: 128 bits
Maximum bandwidth: 6.4 GB/s
Video out: 2 x VGA/DVI
RAMDAC: 350 MHz
Video playback: MPEG2 hardware

In addition to the 420, 440 and 460 versions, Nvidia offered mobile versions (GeForce 4 Go), AGP 8x versions (using the NV18 chip, whose only improvement was the new interface), and even a PCI Express version in 2003: the PCX4300, which used an AGP 8x-to-PCI Express 16x bridge.

010 NV2A: A GeForce In A Console

In 2001, Microsoft introduced its first game console, the Xbox. It was very close to a PC in terms of architecture: it used an x86 processor, ran a Windows NT-derived operating system, and its graphics chip came from Nvidia. That chip, the NV2A, sits between the GeForce 3 and the GeForce 4. It was well optimized for the Xbox and supported DirectX 8.1 (through the console's NT5 kernel), enabling the console to offer some very graphically impressive games for its time.

Nvidia NV2A (Xbox)


Date released: November 2001
Card interface: N/A
Fillrate (texels): 1864 Mtexels/s
Fillrate (pixels): 932 Mpixels/s
Rendering pipelines: 4
Texture units: 8
Vertex units: 2
Chip clock frequency: 233 MHz
Fabrication process: 0.15 µm
Number of transistors: 63 million
DirectX version: 8.1
Memory type: DDR
Maximum memory: 64 MB
Memory clock frequency: 200 MHz (x2)
Memory bus: 128 bits
Maximum bandwidth: 6.4 GB/s

For the Xbox 360, ATI supplied the GPU, and Nvidia went over to the enemy with its RSX chip, used in the PlayStation 3.

011 An Improved GeForce 3: The GeForce 4 Ti

The successor to the GeForce 3, released in February 2002, was called the GeForce 4 Ti. Its architecture was similar to that of the NV20 (GeForce 3), but the NV25 was significantly faster, thanks largely to higher clock speeds. Nvidia gave the GeForce 4 Ti approximately three times the Vertex Shader power of the GeForce 3 by increasing the clock frequency and doubling the number of ALUs. In addition, Nvidia improved LMA, the technology that saves memory bandwidth by not processing data that will never be displayed.

Nvidia sold three versions of the card: the Ti 4200, the Ti 4400 and the Ti 4600. The differences among them were in clock speeds: 250 MHz for the GPU and 250 MHz for the memory on the Ti 4200; 275/275 for the Ti 4400; and 300/325 for the high-end Ti 4600.

Nvidia NV25 (GeForce 4 Ti 4600)


Date released: February 2002
Card interface: PCI/AGP 4x
Fillrate (texels): 2400 Mtexels/s
Fillrate (pixels): 1200 Mpixels/s
Rendering pipelines: 4
Texture units: 8
Vertex Shader units: 2
Pixel Shader version: 1.3
Chip clock frequency: 300 MHz
Fabrication process: 0.15 µm
Number of transistors: 63 million
DirectX version: 8
Memory type: DDR
Maximum memory: 128 MB
Memory clock frequency: 325 MHz (x2)
Memory bus: 128 bits
Maximum bandwidth: 10.4 GB/s
Video out: 2 x VGA
RAMDAC: 350 MHz
Video playback: MPEG2 semi-hardware

Late in 2002, the NV28 arrived. This GPU was similar to the NV25, simply adding AGP 8x support to the GeForce 4 Ti cards. The GeForce 4 Ti 4800 (300/325) was identical to the GeForce 4 Ti 4600 except for its AGP 8x compatibility. Note that the GeForce 4 Ti 4200 128 MB had lower bandwidth than the 64 MB version, because its memory ran at 222 MHz rather than 250 MHz.

012 NV30: Nvidia Loses With The FX 5800

In January 2003, Nvidia released the GeForce FX 5800 (NV30). This card was criticized both for its performance, which was unworthy of a high-end card, and for its high noise level. Released at around the same time, ATI's Radeon 9700 Pro was both more efficient and faster. The NV30 was a commercial failure, even if Nvidia sometimes says that the failure was one of the best things that ever happened to the company, since it proved you can never rest on your laurels.

Nvidia NV30 (GeForce FX 5800)


Date released: January 2003
Card interface: PCI/AGP 8x
Fillrate (texels): 3200 Mtexels/s
Fillrate (pixels): 1600 Mpixels/s
Rendering pipelines: 4
Texture units: 8
Vertex Shader units: 2
Pixel Shader version: 2.0a
Chip clock frequency: 400 MHz
Fabrication process: 0.13 µm
Number of transistors: 125 million
DirectX version: 9
Memory type: DDR2
Memory (typical): 128 MB
Memory clock frequency: 400 MHz (x2)
Memory bus: 128 bits
Maximum bandwidth: 12.8 GB/s
Video out: 2 x VGA
RAMDAC: 400 MHz
Video playback: MPEG2 hardware

The Ultra version of the card was faster (or shall we say less slow), with both the GPU and the DDR2 memory clocked at 500 MHz.

013 NV3x: Nvidia Releases FX (And PCX) Versions

Even after the failure of the NV30, Nvidia kept the architecture, with the GeForce FX 5900 replacing the GeForce FX 5800. With its 256-bit memory bus and improved vertex processing power, the FX 5900 managed to hold its own against competing cards like the Radeon 9800 Pro. Nvidia also released entry-level and midrange versions of the GeForce FX: the FX 5600 (NV31) and FX 5700 (NV36) in the midrange, and the entry-level FX 5200 (NV34). These cards are noteworthy mainly in that the previous generation's midrange card, the GeForce 4 Ti 4200, could outperform them.

Nvidia NV3x
NV35 (FX 5900)
Date released: May 2003
Card interface: PCI/AGP 8x
Fillrate (texels): 3200 Mtexels/s
Fillrate (pixels): 1600 Mpixels/s
Rendering pipelines: 4
Texture units: 8
Vertex Shader units: 3
Chip clock frequency: 400 MHz
Fabrication process: 0.13 µm
Number of transistors: 130 million
DirectX version: 9
Pixel Shader version: 2.0a
Memory type: DDR
Memory (typical): 256 MB
Memory clock frequency: 425 MHz (x2)
Memory bus: 256 bits
Maximum bandwidth: 27.2 GB/s
Video out: 2 x VGA
RAMDAC: 400 MHz
Video playback: MPEG2 hardware

NV31 (FX 5600)
Date released: March 2003
Card interface: PCI/AGP 8x
Fillrate (texels): 1300 Mtexels/s
Fillrate (pixels): 1300 Mpixels/s
Rendering pipelines: 4
Texture units: 4
Vertex Shader units: 1
Chip clock frequency: 325 MHz
Fabrication process: 0.13 µm
Number of transistors: 80 million
DirectX version: 9
Pixel Shader version: 2.0a
Memory type: DDR
Memory (typical): 128 MB
Memory clock frequency: 275 MHz (x2)
Memory bus: 128 bits
Maximum bandwidth: 8.8 GB/s
Video out: 2 x VGA
RAMDAC: 350 MHz
Video playback: MPEG2 hardware

NV36 (FX 5700)
Date released: October 2003
Card interface: PCI/AGP 8x
Fillrate (texels): 1700 Mtexels/s
Fillrate (pixels): 1700 Mpixels/s
Rendering pipelines: 4
Texture units: 4
Vertex Shader units: 3
Chip clock frequency: 425 MHz
Fabrication process: 0.13 µm
Number of transistors: 82 million
DirectX version: 9
Pixel Shader version: 2.0a
Memory type: DDR
Memory (typical): 256 MB
Memory clock frequency: 250 MHz (x2)
Memory bus: 128 bits
Maximum bandwidth: 8 GB/s
Video out: 2 x VGA
RAMDAC: 350 MHz
Video playback: MPEG2 hardware

NV34 (FX 5200)
Date released: March 2003
Card interface: PCI/AGP 8x
Fillrate (texels): 1000 Mtexels/s
Fillrate (pixels): 1000 Mpixels/s
Rendering pipelines: 4
Texture units: 4
Vertex Shader units: 1
Chip clock frequency: 250 MHz
Fabrication process: 0.13 µm
Number of transistors: 47 million
DirectX version: 9
Pixel Shader version: 2.0a
Memory type: DDR
Memory (typical): 128 MB
Memory clock frequency: 200 MHz (x2)
Memory bus: 128 bits
Maximum bandwidth: 6.4 GB/s
Video out: 2 x VGA
RAMDAC: 350 MHz
Video playback: MPEG2 hardware

Nvidia also released PCI Express cards, the GeForce PCX series, but they were essentially AGP cards fitted with an AGP-to-PCI Express bridge. Some FX 5200 cards had a 64-bit bus (instead of 128-bit) and a slower memory clock (166 MHz instead of 200 MHz).

014 NV40/NV45: Nvidia Gets Back In The Race With The GeForce 6800 And SLI

After the failure of the NV30, it was imperative for Nvidia to snap back, and it did, with the NV40, also known as the GeForce 6800. This card was extremely efficient and more powerful than the FX 5900, thanks in part to its large number of transistors (222 million). The NV45, also called GeForce 6800, was nothing more than an NV40 with an AGP-to-PCI Express bridge, giving the card support for the new standard and, above all, for SLI. SLI couples two PCI Express GeForce 6 cards to increase performance.

Nvidia NV40 and NV45 (GeForce 6800 Ultra)


GeForce 6800 Ultra (NV40, AGP)
Date released: April 2004
Card interface: AGP 8x
Fillrate (texels): 6400 Mtexels/s
Fillrate (pixels): 6400 Mpixels/s
Rendering pipelines: 16
Texture units: 16
Vertex Shader units: 6
Chip clock frequency: 400 MHz
Fabrication process: 0.13 µm
Number of transistors: 222 million
DirectX version: 9c
Pixel Shader version: 3.0
Memory type: GDDR3
Memory (typical): 256 MB
Memory clock frequency: 550 MHz (x2)
Memory bus: 256 bits
Maximum bandwidth: 35.2 GB/s
Video out: 2 x VGA
RAMDAC: 400 MHz
Video playback: MPEG2 hardware
Multi-GPU support: N/A

GeForce 6800 Ultra (NV45, PCI Express)
Date released: March 2005
Card interface: PCI Express 16x
Fillrate (texels): 6400 Mtexels/s
Fillrate (pixels): 6400 Mpixels/s
Rendering pipelines: 16
Texture units: 16
Vertex Shader units: 6
Chip clock frequency: 400 MHz
Fabrication process: 0.13 µm
Number of transistors: 222 million
DirectX version: 9c
Pixel Shader version: 3.0
Memory type: GDDR3
Memory (typical): 256 MB
Memory clock frequency: 550 MHz (x2)
Memory bus: 256 bits
Maximum bandwidth: 35.2 GB/s
Video out: 2 x VGA
RAMDAC: 400 MHz
Video playback: MPEG2 hardware
Multi-GPU support: 2

Cards based on the NV41 and NV42 were also produced. The NV41 is an NV40 with fewer processing units (12 pipelines and 5 vertex units), used in certain GeForce 6800 cards; the NV42 is an NV41 fabricated on a 110 nm process (and thus less expensive to produce).

015 GeForce 6 Invades The Planet

After the GeForce 6800, Nvidia needed to introduce cards that were slower and less expensive. The NV40 was powerful, but its 222 million transistors limited fabrication yields and drove up the price, so the two cards derived from it, the GeForce 6600 and 6200, had only moderate success. The GeForce 6600, fabricated at 110 nm and based on the NV43, offered good performance at a decent price, and the PCI Express versions could even operate in SLI mode. The GeForce 6600 was the first natively PCI Express Nvidia card; AGP versions used a PCI Express-to-AGP bridge. The GeForce 6200 was an entry-level card, not very powerful but not very expensive. PCI Express, AGP, and PCI versions were produced, and there were also versions built into laptops.

Nvidia NV43 and NV44 (GeForce 6600 GT and GeForce 6200)


GeForce 6600 GT (NV43)
Date released: August 2004
Card interface: PCI Express 16x
Fillrate (texels): 4000 Mtexels/s
Fillrate (pixels): 2000 Mpixels/s
Rendering pipelines: 4
Texture units: 8
Vertex Shader units: 3
Chip clock frequency: 500 MHz
Fabrication process: 0.11 µm
Number of transistors: 143 million
DirectX version: 9c
Pixel Shader version: 3.0
Memory type: GDDR3
Memory (typical): 128 MB
Memory clock frequency: 450 MHz (x2)
Memory bus: 128 bits
Maximum bandwidth: 14.2 GB/s
Video out: 2 x VGA
RAMDAC: 400 MHz
Video playback: MPEG2 hardware
Multi-GPU support: 2

GeForce 6200 (NV44)
Date released: August 2004
Card interface: PCI Express 16x
Fillrate (texels): 1400 Mtexels/s
Fillrate (pixels): 700 Mpixels/s
Rendering pipelines: 2
Texture units: 4
Vertex Shader units: 3
Chip clock frequency: 350 MHz
Fabrication process: 0.11 µm
Number of transistors: 77 million
DirectX version: 9c
Pixel Shader version: 3.0
Memory type: GDDR3
Memory (typical): 64 MB
Memory clock frequency: 350 MHz (x2)
Memory bus: 64 bits
Maximum bandwidth: 5.6 GB/s
Video out: 2 x VGA
RAMDAC: 400 MHz
Video playback: MPEG2 hardware
Multi-GPU support: N/A

The GeForce 6200 was Nvidia's first TurboCache card: in addition to its dedicated memory (16 to 512 MB), it can use system RAM as video memory. Some manufacturers took advantage of this to tout the GeForce 6200 as a 256 MB card when it actually had only 64 MB of dedicated memory. Note also that an integrated version of the NV44, the GeForce 6100, was included in certain Nvidia chipsets; that chip used a 90 nm process and had a single rendering pipeline and no dedicated memory.

016 G70 And G71: Nvidia Changes Its Nomenclature

In 2005, Nvidia announced the GeForce 7. The GPU code names, which had traditionally followed the NVxx pattern, changed to Gxx. The first card was the G70 (GeForce 7800), followed fairly quickly by the G71 (GeForce 7900). More powerful than the 6800 series, the GeForce 7800 was a success for Nvidia. The cards were sold in many different versions, such as the GTX and GS, and AGP versions with a PCI Express-to-AGP bridge were also offered.

Nvidia G70 and G71 (GeForce 7800 GTX and 7900 GTX)
GeForce 7800 GTX (G70)
Date released: June 2005
Card interface: PCI Express 16x
Fillrate (texels): 13200 Mtexels/s
Fillrate (pixels): 8800 Mpixels/s
Rendering pipelines: 16
Texture units: 24
Vertex units: 8
Chip clock frequency: 550 MHz
Fabrication process: 0.11 µm
Number of transistors: 302 million
DirectX version: 9c
Pixel Shader version: 3.0
Memory type: GDDR3
Memory (typical): 512 MB
Memory clock frequency: 850 MHz (x2)
Memory bus: 256 bits
Maximum bandwidth: 54.4 GB/s
Video out: 2 x VGA
RAMDAC: 400 MHz
Video playback: MPEG2 hardware, WMV9 semi-hardware
Multi-GPU support: 2

GeForce 7900 GTX (G71)
Date released: March 2006
Card interface: PCI Express 16x
Fillrate (texels): 15600 Mtexels/s
Fillrate (pixels): 10400 Mpixels/s
Rendering pipelines: 16
Texture units: 24
Vertex units: 8
Chip clock frequency: 650 MHz
Fabrication process: 0.09 µm
Number of transistors: 278 million
DirectX version: 9c
Pixel Shader version: 3.0
Memory type: GDDR3
Memory (typical): 512 MB
Memory clock frequency: 800 MHz (x2)
Memory bus: 256 bits
Maximum bandwidth: 51.2 GB/s
Video out: 2 x VGA
RAMDAC: 400 MHz
Video playback: MPEG2 hardware, WMV9 semi-hardware
Multi-GPU support: 4 (2x2)

With the GeForce 7900, Nvidia also used, for the first time, a technique its competitors had already tried: dual-GPU cards. The 7900GX2 and 7950GX2 had two G71s working in parallel, and the company would reuse the approach in 2008 with the GeForce 9800GX2.

017 G72 And G73: Low-End GeForce 7s

As had become its habit, Nvidia released two other versions of its high-end architecture: one entry-level (G72, GeForce 7300) and one midrange (G73, GeForce 7600). Both chips were fabricated on a 90 nm process and offered adequate performance. As is often the case, the mobile versions used the midrange chips, and the GeForce 7300 Go was very popular.

Nvidia G72 and G73 (GeForce 7300 GS and 7600 GT)


GeForce 7300 GS (G72)
Date released: January 2006
Card interface: PCI Express 16x
Fillrate (texels): 2200 Mtexels/s
Fillrate (pixels): 1100 Mpixels/s
Rendering pipelines: 2
Texture units: 4
Vertex Shader units: 3
Chip clock frequency: 550 MHz
Fabrication process: 0.09 µm
Number of transistors: 112 million
DirectX version: 9c
Pixel Shader version: 3.0
Memory type: GDDR
Memory (typical): 128 MB
Memory clock frequency: 400 MHz (x2)
Memory bus: 64 bits
Maximum bandwidth: 6.4 GB/s
Video out: 2 x VGA
RAMDAC: 400 MHz
Video playback: MPEG2 hardware, WMV9 semi-hardware
Multi-GPU support: N/A

GeForce 7600 GT (G73)
Date released: March 2006
Card interface: PCI Express 16x
Fillrate (texels): 6720 Mtexels/s
Fillrate (pixels): 4480 Mpixels/s
Rendering pipelines: 8
Texture units: 12
Vertex Shader units: 5
Chip clock frequency: 560 MHz
Fabrication process: 0.09 µm
Number of transistors: 177 million
DirectX version: 9c
Pixel Shader version: 3.0
Memory type: GDDR3
Memory (typical): 256 MB
Memory clock frequency: 700 MHz (x2)
Memory bus: 128 bits
Maximum bandwidth: 22.4 GB/s
Video out: 2 x VGA + 2 x TMDS
RAMDAC: 400 MHz
Video playback: MPEG2 hardware, WMV9 semi-hardware
Multi-GPU support: 2

Slower (7200 Go) and faster (7400 Go) mobile versions were also produced, and Nvidia later sold an 80 nm version of the G73.

018 Nvidia And The 8800: GeForce 8 Or GeForce 9?

In November 2006, Nvidia announced the G80. This chip and its derivatives were destined for a long life: as of 2008, some of the fastest cards available from Nvidia still used a chip very close to the G80 (the G92). Nvidia got as much mileage as possible out of the design, and the move to a 65 nm process with the G92 let the company cut the cost of the chip. By varying the number of stream processors, the width of the memory bus, and the clock speeds, Nvidia produced a plethora of GeForce 8800 and 9800 versions; there is even a dual-GPU version, the GeForce 9800GX2. The GeForce 8800 series cards were all DirectX 10 compatible, and Nvidia scored a great success with this series, pending the arrival of the GeForce GTX.

Nvidia G80 and G92 (GeForce 8800 GTX and 9800 GTX)
GeForce 8800 GTX (G80)
Date released: November 2006
Card interface: PCI Express 16x
Fillrate (texels): 18400 Mtexels/s
Fillrate (pixels): 13800 Mpixels/s
Rendering pipelines: 24
Texture units: 32
Stream processors: 128
Chip clock frequency: 575 MHz
Fabrication process: 0.09 µm
Number of transistors: 681 million
DirectX version: 10
Pixel Shader version: 4.0
Memory type: GDDR3
Memory (typical): 768 MB
Memory clock frequency: 900 MHz (x2)
Memory bus: 384 bits
Maximum bandwidth: 86.4 GB/s
Video out: NVIO
RAMDAC: 400 MHz
Video playback: MPEG2 hardware, WMV9 semi-hardware
Multi-GPU support: 3

GeForce 9800 GTX (G92)
Date released: April 2008
Card interface: PCI Express 16x (2.0)
Fillrate (texels): 43875 Mtexels/s
Fillrate (pixels): 10800 Mpixels/s
Rendering pipelines: 16
Texture units: 64
Stream processors: 128
Chip clock frequency: 675 MHz
Fabrication process: 0.065 µm
Number of transistors: 754 million
DirectX version: 10
Pixel Shader version: 4.0
Memory type: GDDR3
Memory (typical): 512 MB
Memory clock frequency: 1100 MHz (x2)
Memory bus: 256 bits
Maximum bandwidth: 70.4 GB/s
Video out: 2 x TMDS (dual-link), HDCP
RAMDAC: 400 MHz
Video playback: MPEG2 hardware, H.264 hardware
Multi-GPU support: 3
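One detail worth spelling out from this table: despite its older design, the 8800 GTX keeps a clear memory-bandwidth advantage over the 9800 GTX, because its wider 384-bit bus more than compensates for the lower memory clock. A quick check, reusing the bandwidth arithmetic sketched earlier (the helper name is mine):

```python
def peak_bandwidth_gbs(bus_bits, clock_mhz, transfers_per_clock=2):
    """Peak bandwidth in GB/s; defaults to double-pumped (DDR-type) memory."""
    return (bus_bits / 8) * clock_mhz * transfers_per_clock / 1000

# GeForce 8800 GTX (G80): 384-bit bus, GDDR3 at 900 MHz (x2)
print(peak_bandwidth_gbs(384, 900))   # 86.4 GB/s
# GeForce 9800 GTX (G92): 256-bit bus, GDDR3 at 1100 MHz (x2)
print(peak_bandwidth_gbs(256, 1100))  # 70.4 GB/s
```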

Just for a laugh, let's run through all the GeForce 8800 series cards that were released: the 8800GS 384, 8800GS 768, 8800GTS 320, 8800GTS 640, 8800GTS 640 v2, 8800GTS 512, 8800GT 256, 8800GT 512, 8800GT 1024, 8800GTX 768 and 8800 Ultra 768. Then there are the 9600GSO 512, 9600GSO 384 and 9600GSO 768, plus the 9800GX2 and 9800GTX, not to mention the future 9800GTS and 9800GT. And that's not counting the mobile versions!

019 Entry-Level GeForce 8s

To market economy versions of the architecture, Nvidia had to modify the G80 heavily; given its transistor count, using it as-is was out of the question. So the company offered essentially three chips: the GeForce 8400 (G86), GeForce 8600 (G84) and GeForce 9600 (G94). Other versions existed (GeForce 8300, 8500, and so on), but those three are the major ones. The G84 was widely used in notebooks as a high-end part, whereas in desktop PCs it was only a midrange GPU.

Nvidia G84, G86 and G94 (GeForce 8600 GT, GeForce 8400 GS and 9600 GT)
GeForce 8600 GT (G84)
Date released: April 2007
Card interface: PCI Express 16x
Fillrate (texels): 8640 Mtexels/s
Fillrate (pixels): 4320 Mpixels/s
Rendering pipelines: 8
Texture units: 16
Stream processors: 32
Chip clock frequency: 540 MHz
Fabrication process: 0.08 µm
Number of transistors: 289 million
DirectX version: 10
Pixel Shader version: 4.0
Memory type: GDDR3
Memory (typical): 256 MB
Memory clock frequency: 700 MHz (x2)
Memory bus: 128 bits
Maximum bandwidth: 22.4 GB/s
Video out: 2 x TMDS (dual-link), HDCP
RAMDAC: 400 MHz
Video playback: MPEG2 hardware, H.264 hardware
Multi-GPU support: 2

GeForce 8400 GS (G86)
Date released: June 2007
Card interface: PCI Express 16x
Fillrate (texels): 3600 Mtexels/s
Fillrate (pixels): 1800 Mpixels/s
Rendering pipelines: 4
Texture units: 8
Stream processors: 16
Chip clock frequency: 450 MHz
Fabrication process: 0.08 µm
Number of transistors: 210 million
DirectX version: 10
Pixel Shader version: 4.0
Memory type: DDR2
Memory (typical): 256 MB
Memory clock frequency: 400 MHz (x2)
Memory bus: 64 bits
Maximum bandwidth: 6.4 GB/s
Video out: 2 x TMDS (dual-link), HDCP
RAMDAC: 400 MHz
Video playback: MPEG2 hardware, H.264 hardware
Multi-GPU support: N/A

GeForce 9600 GT (G94)
Date released: February 2008
Card interface: PCI Express 16x (2.0)
Fillrate (texels): 20800 Mtexels/s
Fillrate (pixels): 10400 Mpixels/s
Rendering pipelines: 16
Texture units: 32
Stream processors: 64
Chip clock frequency: 650 MHz
Fabrication process: 0.065 µm
Number of transistors: 505 million
DirectX version: 10
Pixel Shader version: 4.0
Memory type: GDDR3
Memory (typical): 512 MB
Memory clock frequency: 900 MHz (x2)
Memory bus: 256 bits
Maximum bandwidth: 57.6 GB/s
Video out: 2 x TMDS (dual-link), HDCP
RAMDAC: 400 MHz
Video playback: MPEG2 hardware, H.264 hardware
Multi-GPU support: 2

The GeForce 8600 and GeForce 8400 were as mediocre as the G80 and GeForce 8800 were successful. The gap between the high-end and midrange cards (before the arrival of the GeForce 9600) was very wide in this generation, which caused problems for gamers.
