ATI Radeon 2600 XT


Temika

Aug 4, 2024, 1:27:03 PM8/4/24
to stilgeichondman
Hi, I have this excellent AGP card, and I thought it would be optimal for a Win98 build, but there are no Win95/98 drivers AFAIK!

Can anyone point me in the right direction? Some modified drivers at least?

Thanks to this impressive community.


They may be in the same "league" performance-wise, but without driver support, that kind of kills the whole idea of being able to run these cards in Win98, does it not?

And an R420/R430 architecture on 110/130 nm is nothing like an RV630 on 65 nm tech. Let alone the core config, etc. It needs specific drivers.


For P3 motherboards like the ones you listed, you should probably avoid "bridged" cards (which use a translator chip to let PCI Express GPUs communicate with an AGP slot). Bridged cards have hardware compatibility problems with P3-era chipsets (not sure about the 440BX, but it definitely applies to VIA).

So even if you were to install XP to get the HD2600 working, you'd probably run into artifacts when using it in 3D with those motherboards.


If I remember correctly, the X300 and X600 were both just new labels slapped on old 9000-series cards, and only the X800 was actually a new architecture, so I'd expect Catalyst 6.2 to work on Radeon X300 and X600 cards but not the X800.


Tom's Hardware is roughly categorizing performance, and that is not an indicator of compatibility. Old high-end cards are designed differently from newer midrange/budget cards, and they are intended to run different software. Maybe an old iPhone is as fast as a Pentium 4... but the P4 runs Windows 98 and the iPhone doesn't, right?


The X850 was launched as ATI's flagship part in 2004 and is the last ATI video card with [beta] driver support for Windows 9x. This is noted in the Catalyst 6.2 release notes. The 2600 was a low-to-midrange product launched in 2007. Windows XP came out in 2001, and Windows Vista was out around the end of 2006.


I happen to have a Radeon X850 and I've tried using it in Windows ME (it works). However, it's needlessly fast... I can get hundreds of frames per second in most Windows 98 era games with a mere Radeon 9800 Pro (1600x1200, 8xAA). IMO anything faster than a Radeon 9700 is better suited to Windows XP. Especially if you're using a Pentium III on a BX board (great combination for Windows 98!), you won't lose any performance by choosing an older video card.


The Unified Video Decoder (UVD) SIP core is on-die in the HD 2400 and the HD 2600. The HD 2900 GPU dies do not have a UVD core, as their stream processors were powerful enough to handle most of the steps of video acceleration in its stead, except for entropy decoding and bitstream processing, which are left to the CPU.[4]


HDTV encoding support is implemented via the integrated AMD Xilleon encoder; the companion Rage Theater chip used on the Radeon X1000 series was replaced with the digital Theater 200 chip, providing VIVO capabilities.


For display outputs, all variants include two dual-link TMDS transmitters, except for the HD 2400 and HD 3400, which include one single-link and one dual-link TMDS transmitter. Each DVI output includes a dual-link HDCP encoder with an on-chip decipher key. HDMI was introduced, supporting display resolutions up to 1,920 × 1,080, with an integrated HD audio controller offering 5.1-channel LPCM and AC3 encoding support. Audio is transmitted via the DVI port, using a specially designed DVI-to-HDMI dongle that carries both audio and video.[5]


The R600 family is called the Radeon HD 2000 series, with the enthusiast segment being the Radeon HD 2900 series, which originally comprised the Radeon HD 2900 XT with GDDR3 memory released on May 14, 2007, and the higher-clocked GDDR4 version in early July.


Previously, no HD 2000 series products were offered in the performance segment; ATI used models from the previous generation to address that target market. This did not change until the release of further variants of the Radeon HD 2900 series, the Radeon HD 2900 Pro and GT, which filled the gap in the performance market for a short period of time.


The Radeon HD 2400 series was based on the GPU codenamed RV610. It had 180 million transistors on a 65 nm fabrication process and used a 64-bit-wide memory bus.[9] The die size is 85 mm2.[10] The official PCB design implements only a passive-cooling heatsink instead of a fan, and official claims put power consumption as low as 35 W.[citation needed] The core has a 16 kiB unified vertex/texture cache, moving away from the dedicated vertex cache and L1/L2 texture caches used in the higher-end models.


Reports indicated that the first batch of the RV610 core (silicon revision A12), released only to system builders, had a bug that prevented the UVD from working properly, though other parts of the die operated normally. Those products, sold as the Radeon HD 2350 series, were officially supported with the release of the Catalyst 7.10 driver.[11]


Several reports from owners of the HD 2400 Pro suggest the card does not fully support hardware decoding for all H.264/VC-1 videos. The device driver, even in the latest stable version, seems to enable hardware decoding only for formats specified in the Blu-ray and HD DVD specifications. As a result of this restriction, the card is not deemed very useful for hardware video decoding, since the majority of H.264/VC-1 videos on the net are not encoded in those formats (even though the hardware itself is fully capable of decoding them). This driver restriction led to the development of a third-party driver patch, "ExDeus ATI HD Registry Tweak", to unlock the potential of the HD 2400 Pro for full H.264/VC-1 hardware video decoding.[12][13][14]


The Radeon HD 2600 series was based on the GPU codenamed RV630 and packed 390 million transistors on a 65 nm fabrication process. The Radeon HD 2600-series video cards included GDDR3 support, a 128-bit memory ring bus and 4-phase digital PWM,[9] spanning a die size of 153 mm2.[15] Neither of the GDDR3 reference PCI-E designs required additional power connectors, whereas the HD 2600 Pro and XT AGP variants required additional power through either 4-pin or 6-pin connectors.[16] Official claims state that the Radeon HD 2600 series consumes as little as 45 W of power.[citation needed]


The Radeon HD 2600 X2 is a dual-GPU product which includes two RV630 dies on a single PCB, with a PCI-E bridge splitting the PCI-E x16 bandwidth into two groups of eight PCI-E lanes (each 2.0 Gbit/s). The card provides four DVI outputs, or HDMI outputs via dongle, and supports CrossFire configurations. AMD calls this product the Radeon HD 2600 X2, as used by some vendors and as observed inside the INF file of Catalyst 7.9, version 8.411. Sapphire and other vendors, including PowerColor and GeCube, have either announced or demonstrated their respective dual-GPU (connected by CrossFire) products.[17] Catalyst 7.9 added support for this hardware in September 2007, although AMD did little to promote it. A vendor may offer cards containing 256 MiB, 512 MiB, or 1 GiB of video memory. Although the memory technology used is at the vendor's discretion, most vendors have opted for GDDR3 and DDR2 due to lower manufacturing cost and the positioning of this product for the mainstream rather than the performance market segment.
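The per-GPU bandwidth implied by that x16-into-two-x8 split can be sketched with a little arithmetic, assuming PCIe 1.x signaling (2.5 GT/s per lane with 8b/10b encoding, leaving the 2.0 Gbit/s of usable data per lane quoted above):

```python
# Sketch: bandwidth each RV630 die gets once the X2's bridge splits
# the x16 link into two x8 groups. Assumes PCIe 1.x lanes: 2.5 GT/s
# raw, 8b/10b encoded, so 2.0 Gbit/s of usable data per lane.

GBIT_PER_LANE = 2.0      # usable data rate per PCIe 1.x lane, Gbit/s
LANES_PER_GPU = 8        # x16 slot split into two x8 groups

gbit_per_gpu = GBIT_PER_LANE * LANES_PER_GPU   # 16 Gbit/s per GPU
gbyte_per_gpu = gbit_per_gpu / 8               # 2.0 GB/s per GPU

print(f"Each RV630 die gets {gbyte_per_gpu:.1f} GB/s of PCIe bandwidth")
```

So each die keeps half the slot's usable throughput, which is why the bridge chip rather than lane count was the main cost of the dual-GPU layout.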


The Radeon HD 2900 series was based on the GPU codenamed R600 and was launched on May 14, 2007. R600 packed 700 million transistors on an 80 nm fabrication process and had a 420 mm2 die size.[18] The Radeon HD 2900 XT was launched with 320 stream processors and a core clock of 743 MHz. The initial model was released with 512 MB of GDDR3 clocked at 828 MHz (1,656 MHz effective) on a 512-bit interface. A couple of months after release, ATI released the 1 GB GDDR4 model with a memory frequency of 1,000 MHz (2,000 MHz effective). Performance was on par with the 512 MB card. The HD 2900 XT introduced a number of firsts: it was the first to implement a digital PWM on board (7-phase), the first to use an 8-pin PEG connector, and the first graphics card from ATI to support DirectX 10.
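The peak memory bandwidth behind those figures follows from bus width times effective clock; a minimal sketch, using only the numbers quoted above (decimal GB/s):

```python
# Sketch: peak theoretical memory bandwidth for the HD 2900 XT variants,
# computed from the figures above: (bus width in bytes) x effective clock.

def peak_bandwidth_gbs(bus_width_bits: int, effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s (decimal gigabytes)."""
    return (bus_width_bits / 8) * effective_mhz * 1e6 / 1e9

gddr3 = peak_bandwidth_gbs(512, 1656)   # 512 MB GDDR3 model
gddr4 = peak_bandwidth_gbs(512, 2000)   # 1 GB GDDR4 model

print(f"GDDR3: {gddr3:.1f} GB/s, GDDR4: {gddr4:.1f} GB/s")
# -> GDDR3: 106.0 GB/s, GDDR4: 128.0 GB/s
```

The roughly 20% bandwidth gain of the GDDR4 model helps explain why real-world performance was nonetheless on par: the 512-bit GDDR3 card was rarely bandwidth-limited in the first place.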


The Radeon HD 2900 Pro was clocked lower at 600 MHz core and 800 MHz memory (1,600 MHz effective), configured with 512 MB of GDDR3 or 1 GB of GDDR4. It was rumored that some of the 1 GB GDDR4 models were manufactured using a 12" cooler borrowed from the prototype HD 2900 XTX.[19] The HD 2900 Pro had both 256-bit and 512-bit interface options for the 512 MB versions of the card. A few AIB partners offered a black and silver cooler exclusive to the 256-bit model of the Pro.[20][21]


All Mobility Radeon HD 2000 series share the same feature set support as their desktop counterparts, as well as the addition of the battery-conserving PowerPlay 7.0 features, which are augmented from the previous generation's PowerPlay 6.0.


The Mobility Radeon HD 2300 is a budget product which includes UVD in silicon but lacks the unified shader architecture and DirectX 10.0/SM 4.0 support, limiting it to DirectX 9.0c/SM 3.0 using the more traditional architecture of the previous generation. A high-end variant, the Mobility Radeon HD 2700, with higher core and memory frequencies than the Mobility Radeon HD 2600, was released in mid-December 2007.


The half-generation update treatment also applied to mobile products. Announced prior to CES 2008 was the Mobility Radeon HD 3000 series. Released in the first quarter of 2008, it consisted of two families, the Mobility Radeon HD 3400 series and the Mobility Radeon HD 3600 series. The Mobility Radeon HD 3600 series also featured the industry's first implementation of on-board 128-bit GDDR4 memory.


Around late March to early April 2008, AMD updated the device ID list on its website[24] to include the Mobility Radeon HD 3850 X2 and Mobility Radeon HD 3870 X2 and their respective device IDs. Later, at Spring IDF 2008 in Shanghai, a development board of the Mobility Radeon HD 3870 X2 was demonstrated alongside a Centrino 2 platform demonstration system.[25] The Mobility Radeon HD 3870 X2 was based on two M88 GPUs with the addition of a PCI Express switch chip on a single PCB. The development board used for the demonstration was a PCI Express 2.0 x16 card, while the final product was expected to ship on AXIOM/MXM modules.
