Palit GeForce 9800 GTX+ Video Card NE/98TX+XT352
Reviews - Featured Reviews: Video Cards
Written by Tim White - Edited by Olin Coles   
Tuesday, 13 January 2009

PALiT NE/98TX+XT352

Benchmark Reviews has had the wonderful opportunity to review and critique some of the best and most powerful video cards available anywhere. These cards are exciting; dreams of playing your favorite video game larger than life and silky smooth abound. Wouldn't it be great if we could all afford one of these monster video cards? Today Benchmark Reviews takes a close look at what I'll call a junior monster. Currently fourth in NVIDIA's lineup of single-GPU cards, the GeForce 9800 GTX+ is a card for the mainstream gamer, offering refinements on an already proven design: the wildly popular and powerful G92 core. With a die shrink to 55nm, speeds are up and temperatures are down. We will focus specifically on the PALiT non-reference-design GeForce 9800 GTX+ NE/98TX+XT352. With its nearly silent cooler and slightly overclocked core, this card should put up some good numbers.

9800gtx+frontangle.jpg

The 9800 GTX+ was brought about to up the stakes, as it were, in the raging desktop graphics war between NVIDIA and ATI. Specifically, it was aimed directly at the ATI HD4850, and NVIDIA has done a good job hitting their target. The HD48XX series was the first real threat in some time to NVIDIA's crown. This kind of competition is good for us gamers and usually benefits the average mainstream gamer the most. The PALiT NE/98TX+XT352 9800 GTX+ is definitely a player in the mainstream gaming market.

Palit_Logo_400px.png

PALiT, never one to sit idly by, has done what they do best: taken a proven design and put their own spin on it. They have produced some of the most unique and well-designed products I've seen. Let's see if this is one of them.

About the company: PALiT Multimedia, Inc.

PALiT Multimedia Inc. provides a wide range of industry-leading graphics cards to North and Latin America with a focus on service, support, and innovative products. PALiT Multimedia is affiliated with PALiT Microsystems, a world-leading supplier in the design, manufacture, and distribution of PC graphics accelerators, established in 1988. PALiT is well positioned to maintain industry leadership thanks to its vast array of NVIDIA and AMD/ATI VGA products and ongoing development efforts.

GeForce 9800 GTX+ Features

  • PALiT's own-design GeForce 9800 GTX+
  • Higher core speed: 745 MHz vs. the 738 MHz default
  • Revolutionary cooling system
  • Lower operating temperature
  • Silent cooler

The NVIDIA GeForce 9800 GTX+ GPU enables full-throttle, lifelike game play while providing optimal power management with HybridPower technology. The PureVideo HD engine delivers unmatched video quality, while NVIDIA PhysX technology pushes the limits of ultra-realistic game play. And with SLI compatibility, it provides amplified performance when coupled with NVIDIA nForce SLI-ready motherboards. Dollar-for-dollar, this GPU packs great performance.

features.jpg

Beyond your imagination - Gaming and video-watching capabilities are taken to the next level. With 128 screaming-fast cores, each running at a record high of 1836MHz, this is NVIDIA's most powerful single GeForce 9800 GPU. PhysX technology enables a completely new class of physical gaming interaction that will blow you away. And with features such as picture-in-picture content for an interactive movie experience and color-stretch video processing for breathtaking picture clarity, this graphics card takes you further than your expectations.

A complete solution - Unlock next-generation platform features. With SLI, the GeForce 9800 GTX+ GPU offers increased performance of up to 2x in a dual-SLI configuration and up to 2.8x in 3-way SLI mode. Its HybridPower technology delivers graphics performance when you need it and low-power operation when you don't. So, enjoy heart-stopping entertainment or power savings for everyday computing.

NVIDIA PhysX - GeForce GPU support for NVIDIA PhysX technology, enabling a totally new class of physical gaming interaction for a more dynamic and realistic experience with GeForce.

3-way SLI - Industry leading 3-way NVIDIA SLI technology offers amazing performance scaling by implementing 3-way AFR for the world's fastest gaming solution under Windows Vista with solid, state-of-the-art drivers.

NVIDIA CUDA - CUDA technology unlocks the power of the GPU's processing cores to accelerate the most demanding system tasks, such as video transcoding, delivering up to 7x the performance of traditional CPUs.

NE/98TX+XT352 Specifications

  • Bus interface: PCI Express 2.0 Support
  • Memory: 512MB
  • Memory Interface: 256bit
  • Memory Clock: 2200MHz (1100MHz x 2)
  • Core Clock: 745MHz
  • Dual 400MHz RAMDACs
  • Dual Dual-link DVI outputs support 2560x1600 resolution displays
  • 3-way NVIDIA SLI technology
  • 16x full-screen anti-aliasing
  • True 128-bit floating point high dynamic-range (HDR) lighting
  • NVIDIA PureVideo HD technology
  • 4-Phase Power Supply
  • NVIDIA unified architecture with GigaThread technology
  • NVIDIA Quantum Effects physics processing technology
  • NVIDIA ForceWare Unified Driver Architecture (UDA)
  • Certified for Microsoft Windows Vista
  • OpenGL 2.1 support
  • NVIDIA Lumenex Engine
  • Discrete, Programmable Video Processor
  • Hardware Decode Acceleration
  • High-Quality Scaling
  • Inverse Telecine (3:2 & 2:2 Pulldown Correction)
  • Bad Edit Correction
  • Integrated SD and HD TV Output
  • Noise Reduction
  • Edge Enhancement
  • Dynamic Contrast Enhancement
  • Dual Stream Decode Acceleration
  • Dual-link HDCP capable
  • This graphics card requires a power supply unit: a minimum 500W unit (with a minimum 36A rating on the 12V rail) and two 6-pin PCI-E supplementary power connectors.
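
As a quick sanity check on the list above, the peak memory bandwidth implied by the 256-bit interface and the 2200 MHz effective memory clock can be worked out with simple arithmetic. This is a back-of-the-envelope sketch derived only from the specifications listed, not a figure quoted by PALiT:

```python
# Rough peak memory bandwidth implied by the specifications above:
# a 256-bit interface moving data at 2200 MHz effective (1100 MHz x 2).

bus_width_bits = 256        # memory interface width from the spec list
effective_clock_mhz = 2200  # GDDR3 double data rate: 1100 MHz x 2

# bytes per transfer = bus width / 8; transfers per second = clock in Hz
bandwidth_gbs = (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

print(f"Peak memory bandwidth: {bandwidth_gbs:.1f} GB/s")  # prints 70.4 GB/s
```

That 70.4 GB/s result is what you would expect from any reference-clocked 9800 GTX+, since PALiT left the memory speed untouched.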
Card Model   | 9800 GTX | 9800 GTX+ | 9800 GTX+ PALiT Design
Core         | G92 65nm | G92 55nm  | G92 55nm
Core Speed   | 675 MHz  | 738 MHz   | 745 MHz
Shader Speed | 1688 MHz | 1836 MHz  | 1836 MHz
Memory Speed | 1100 MHz | 1100 MHz  | 1100 MHz

Closer Look: PALiT 9800 GTX+

The PALiT 9800 GTX+ arrived in a fairly large FedEx box. I assumed it must have a lot of padding inside to protect the actual product box. I was wrong; it was all product box. The box is massive, bigger than most mainboard boxes. This isn't necessarily a bad thing, since it offers plenty of protection, but it seems to use a lot of extra material that isn't needed. In this day and age, when the buzzword is "Green", this box seems excessive. Don't get me wrong: it's very well done, with high-quality print clearly stating what's inside and a brief technical description on the rear in 12 different languages. PALiT's love-him-or-hate-him mascot Frobot dominates the front cover.

9800gtx+box.jpg

Here we are with the nitty gritty. Included in the box are the graphics card, a DVI-to-VGA adapter, a DVI-to-HDMI adapter, and the driver CD. Not pictured, but included, was a card with a product key for a FREE copy of 3DMark Vantage Advanced Edition. It's not a hard copy of the benchmark tool, just a product key, so you have to download your copy from www.futuremark.com and enter the key during installation. I thought that was a nice touch, and it adds $20 of value to the card. Let's be honest: if you are in the market for this type of card, you'll want to know how it stacks up. Since taking these photos I've found, tucked under a cardboard fold inside the box, a Molex-to-six-pin PCI-E power adapter and a two-pin wire for connecting this graphics card to your sound card to provide audio over HDMI via the included adapter; neither is shown in the photo below.

9800gtx+Included.jpg

Out of the box, you can see this card is clearly not your typical NVIDIA GeForce 9800GTX+. As a matter of fact, it is almost unrecognizable as a GeForce card and actually resembles an ATI card. The first and most notable difference is the red printed circuit board. Next is the massive heatsink/fan/shroud on top. These outside appearances are only the tip of the iceberg, hinting at the differences between this card and the reference design. PALiT is well known for making true non-reference designs; this is nothing new for them, as they have been very successful with both NVIDIA and ATI non-reference designs over the past few years.

9800gtx+Fullfront.jpg

The rear of the PALiT-designed 9800GTX+ only reinforces my impression that this looks like an ATI card. That comes as no real surprise, since PALiT makes both NVIDIA and ATI cards. I've seen many an ATI card with this type of sprung heatsink retention bracket, and if it works, that's all that matters. We here at Benchmarkreviews.com are after performance; after all, there's no room for fanboys here.

9800gtx+backside.jpg

The PALiT 9800GTX+ cooler is a dual-slot design, and although the entire card is not covered with a shroud and ductwork to direct air out the rear, much of the hot air does manage to make its way out the vented second-slot cover. I'd say it's about a 30/70 split: 30% of the air moves out the rear of the case, while 70% blows down, out, and toward the front of the case over the memory, power capacitors, and circuitry, where unfortunately it is recycled inside the case. This card has two dual-link DVI ports and one S-Video port.

9800gtx+Fulltop.jpg

The top view of the card shows this cooler to be a dual nickel-plated heatpipe design mated with many fine aluminum fins for heat dissipation. After seeing other PALiT designs that completely cover the card with a shroud, I'm wondering why the change to an open-air design now. Also visible are the dual six-pin PCI-E power connectors and, near the front, the tri-SLI connectors. All pretty much standard fare for any 9800GTX+. So far the card appears standard in features but anything but standard in the looks department. Let's move on to the details section and get a better look at what's really going on with this card.

NE/98TX+XT352 Detailed Features

The heart of the beast: yes, yet another G92 core. Reference speeds are up from the 9800 GTX (65nm), but other than a small factory overclock, the specs are the same as any other 9800 GTX+. GPU-Z reports this core as a 65nm process, but the numbers on the core do indeed check out as a 55nm part. At least in this case, the smaller die does equate to lower temperatures and allows higher frequencies than a standard 9800 GTX (non-plus).

9800gtx+cpu.jpg

The PALiT-design 9800 GTX+ uses the same memory as most other 9800 GTX+ cards: the very fast Samsung K4J52324QE-BJ08. The BJ08 is rated at 0.8ns, or 1200 MHz, so this memory is currently running under spec. That should mean there's room left for some overclocking. Free performance; who doesn't love that? Eight of these chips are arranged four above and four to the right of the GPU.
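
The overclocking headroom suggested above is easy to quantify. This small sketch just restates the review's arithmetic, taking the 1200 MHz rating from the 0.8 ns speed grade mentioned in the text:

```python
rated_mhz = 1200     # Samsung K4J52324QE-BJ08 rated speed (0.8 ns parts)
shipping_mhz = 1100  # clock the PALiT card actually ships at

# Percentage of rated headroom left on the table at stock settings
headroom_pct = (rated_mhz - shipping_mhz) / shipping_mhz * 100

print(f"Memory headroom vs. shipping clock: {headroom_pct:.1f}%")  # prints 9.1%
```

Roughly 9% of rated headroom at stock, which is why some free performance seems plausible here.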

9800gtx+Ram.jpg

This picture shows that PALiT has used two memory heatspreaders. They are attached with screws through the back of the card and have thermal pads running their length. They seem to be made of some kind of composite material. Since they are not metal, I can't help but think they aren't dissipating much heat, and I'm left wondering if they might be doing more harm than good. It's possible the heatspreaders actually block the air forced down by the GPU fan from cooling the chips.

9800gtx+HSremoveTop.jpg

Below is a comparison of the PALiT-design 9800 GTX+ and a reference-design 9800 GTX+. The most notable difference between the two is the memory IC layout. You can clearly see that the PALiT design is, if nothing else, much cleaner. I'm no electrical engineer, but there appear to be fewer components on the PALiT design, and the components that are present seem much more robust. The card is considerably shorter as well.

9800gtx+HSremove.jpg

9800gtx+ref.jpg

All of this is cooled with a nickel-plated dual-heatpipe cooler with aluminum fins and a copper core, all mounted to a cast aluminum base and held on with only four screws. These screws do not use the typical lightweight springs of the NVIDIA design, but rather a spring-steel cross brace that applies many times the pressure against the core. There is also a PWM-controlled fan that blows down and sideways. It all adds up to very cool operation.

9800gtx+HS3.jpg

In all my testing I pushed this card pretty hard with FurMark, and it topped out at 71 degrees. The card idles around 40 degrees, and during normal gaming, even for hours on end, it usually does not go above 60 degrees. I have to say I'm impressed with this heatsink. I was a bit skeptical when I saw it had only two heatpipes; I've seen similar heatsinks on NVIDIA 9600GTs, so seeing one on a 9800GTX+ worried me a little. This must speak to the success of the die shrink to 55nm and the solid design of this heatsink.

Video Card Testing Methodology

At the start of all tests, the previous display adapter driver is uninstalled and trace components are removed using Driver Cleaner Pro. We then restart the computer system to establish our display settings and define the monitor. Once the hardware is prepared, we begin our testing. The synthetic benchmark tests in 3DMark06 utilize shader models 2.0 and 3.0. For our higher-end VGA products we conduct tests at the following resolutions: 1280x1024 (19" standard LCD), 1680x1050 (22-24" widescreen LCD), and 1920x1200 (24-28" widescreen LCD). In some tests we utilized widescreen monitor resolutions, since more users are adopting these products for their own computing.

Each benchmark test program begins after a system restart, and the very first result for every test will be ignored since it often only caches the test. This process proved extremely important in the World in Conflict and Supreme Commander benchmarks, as the first run served to cache maps allowing subsequent tests to perform much better than the first. Each test is completed five times, with the average results displayed in our article.
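
The warm-up-and-average procedure described above can be sketched in a few lines of Python (`run_test` here is a hypothetical stand-in for launching one timed pass of whichever benchmark is being measured):

```python
from statistics import mean

def averaged_result(run_test, runs=5):
    """Average `runs` benchmark passes after one discarded warm-up pass.

    The first pass is ignored because it often only caches the test; in
    World in Conflict and Supreme Commander the first run loads the maps,
    letting every subsequent run perform much better than the first.
    """
    run_test()  # warm-up pass: result deliberately thrown away
    return mean(run_test() for _ in range(runs))

# Canned FPS numbers for illustration: a slow caching run, then five real runs.
fps = iter([22.0, 41.0, 40.0, 42.0, 39.0, 43.0])
print(averaged_result(lambda: next(fps)))  # prints 41.0
```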

Our site polls and statistics indicate that over 90% of our visitors use their PC for playing video games, and practically every one of you uses one of the screen resolutions mentioned above. Since all of the benchmarks we use for testing represent different game-engine technology and graphics-rendering processes, I feel that this battery of tests will provide a diverse range of results for you to gauge performance on your own computer system. With Vista on the loose and many games coming out for DX10, I ran all tests except 3DMark06 in both Vista Ultimate SP1 64-bit and Windows XP SP3 32-bit.

**For the record, the 8800GT SC tested here is nearly identical to any 9800GT, and benchmark results would be comparable.

Test System

  • Motherboard: EVGA nForce 750i SLI FTW 123-YW-E175-A1
  • Processor: Intel E8400 Core 2 Duo 3.0 GHz (Overclocked to 3.6 GHz)
  • System Memory: Corsair Dominator DDR2 1066 MHz 2 x 2GB
  • Audio: Onboard
  • Disk Drive: Seagate ST325041 250GB 16MB Cache SATA
  • Optical Drive 1: Samsung SH-S203N SATA DVD R/W
  • Optical Drive 2: Samsung SH-S203N SATA DVD R/W
  • Enclosure: Lian Li PC7B Plus II Black Aluminum Mid-Tower ATX Case
  • PSU: Corsair TX750 SLI Ready 80 Plus Active PFC Power Supply
  • Monitor: Apple 23" Cinema HD Display
  • Operating System: Windows XP Professional SP-3 and Vista Ultimate x64 SP-1

Benchmark Applications

  • 3DMark06 v1.1.0 (8x Anti Aliasing & 16x Anisotropic Filtering)
  • Devil May Cry 4 (8x Anti Aliasing and Ultra settings)
  • Crysis v1.21 Benchmark (0x and 4x Anti-Aliasing, High Settings)
  • FarCry 2 Ranch Medium Benchmark (0x and 4x Anti-Aliasing, Ultra settings)

Video Card Test Products

  • EVGA 9600GT KO 512-P3-N865-AR (Forceware v180.48)
  • EVGA 8800GT SC 512-P3-N802-AR (Forceware v180.48)
  • PALiT 9800GTX+ XNE/98TX+XT352 (Forceware v180.48)

3DMark06 Benchmark Results

3DMark is a computer benchmark by Futuremark (formerly named MadOnion) used to determine the DirectX 9 performance of graphics cards. 3DMark06 uses advanced real-time 3D game workloads to measure PC performance using a suite of DirectX 9 3D graphics tests, CPU tests, and 3D feature tests.

3DMark06 tests include all-new HDR/SM3.0 graphics tests, SM2.0 graphics tests, AI- and physics-driven single- and multi-core CPU tests, and a collection of comprehensive feature tests to reliably measure next-generation gaming performance today. Some enthusiasts may note that Benchmark Reviews does not include the CPU-bound tests in our benchmark battery; only graphics-bound tests are included.

Here at Benchmark Reviews, we believe that synthetic benchmark tools are just as valuable as video games, but only so long as you're comparing apples to apples. Since the same test is applied in the same controlled method with each test run, I believe 3DMark is a very reliable tool for comparing graphics cards against one another.

More visitors to Benchmark Reviews operate at 1280x1024 resolution than any other, as it is the native resolution of 19" LCD monitors. Using this resolution as a starting point, maximum settings were applied to 3DMark06, which for these tests means 8x Anti-Aliasing and 16x Anisotropic Filtering. Low-resolution testing allows the graphics processor to plateau at maximum output performance, which shifts demand onto the system components to keep up. At lower resolutions, 3DMark will reflect the GPU's top-end speed in the composite score, indicating full-throttle performance with little load. This makes for a less GPU-dependent test environment and is helpful in measuring maximum output performance in the test results.

The following results reflect the average frames per second for the combined Shader Model 2.0 and Shader Model 3.0 tests, rather than a generic score.

3DMARK06-1280Default.jpg

3DMARK06-1280.jpg

3DMARK06-1680.jpg

3DMARK06-1920.jpg

With 8x Anti-Aliasing and 16x Anisotropic Filtering turned on at 1280 x 1024, the frame rate dropped nearly 50% for the Shader Model 2.0 tests and more than 50% for the Shader Model 3.0 tests. These average frame rates would not be realistically playable in a game, but the test does give us solid comparison information. Like a staircase, each card holds an approximately 10% lead over the card before it. At 1680 x 1050 the steps get a little shorter, but the same approximate 10% difference remains, and we see the same at 1920 x 1200: smaller steps, but a 10% difference.

From our default settings to the top settings, we see the average frame rate drop by over 66%! Again, this information is good for comparison's sake, but we cannot directly relate it to any specific game or to how these cards may perform in the real world. In the next section we'll see some real game benchmarks that should fairly accurately indicate how these cards might perform for you.

Devil May Cry 4 Benchmark Results

Devil May Cry 4 was released on PC in 2008 as the fourth installment in the Devil May Cry video game series. DMC4 is a direct port between the PC platform and the console versions, which operate at the native 720P game resolution with no other platform restrictions. Devil May Cry 4 uses the refined MT Framework game engine, which has been used for many popular Capcom titles over the past several years.

MT Framework is an exclusive seventh-generation game engine built for games developed for the PlayStation 3, Xbox 360, and PC. MT stands for "Multi-Thread", "Meta Tools", and "Multi-Target". Capcom originally intended to use an outside engine, but none matched their specific requirements for performance and flexibility. Games using the MT Framework are originally developed on the PC and then ported to the two console platforms.

On the PC version, a special bonus called Turbo Mode is featured, giving the game a slightly faster speed, and a new difficulty called Legendary Dark Knight Mode is implemented. The PC version also has both DirectX 9 and DirectX 10 modes for the Microsoft Windows XP and Vista operating systems.

It's always nice to be able to compare the results we receive here at Benchmark Reviews with the results you get on your own computer system. Usually this isn't possible, since settings and configurations make it nearly impossible to match one system to the next; plus, you have to own the game or benchmark tool we used.

Devil May Cry 4 fixes this by offering a free benchmark tool for download. Because the DMC4 MT Framework game engine is rather low-demand for today's cutting-edge multi-GPU video cards, Benchmark Reviews tests at 1920x1200 with 8x AA (the highest AA setting available to Radeon HD video cards) and the highest in-game settings available. The benchmark runs through four test scenes, but scenes #2 and #4 are the ones that usually offer a challenge. Displayed below is our result for the test.

DMC4-1920-8AA.jpg

Devil May Cry 4 really posed no problem for any of these cards, whether in DirectX 9 or DirectX 10, so I'll only show the results from the highest resolution tested, 1920 x 1200. Although Devil May Cry 4 can be played in DX9 or DX10, it's very clear that this game prefers DX9. What is amazing is that in DX10, each card's results would almost exactly equal those of the next card down playing the same test in DX9, with very few differences noticeable to the naked eye.

Crysis Benchmark Results

Crysis uses a new graphics engine: the CryENGINE2, which is the successor to Far Cry's CryENGINE. CryENGINE2 is among the first engines to use the Direct3D 10 (DirectX10) framework of Windows Vista, but can also run using DirectX9, both on Vista and Windows XP.

Roy Taylor, Vice President of Content Relations at NVIDIA, has spoken on the subject of the engine's complexity, stating that Crysis has over a million lines of code, 1GB of texture data, and 85,000 shaders. To get the most out of modern multicore processor architectures, CPU intensive subsystems of CryENGINE 2 such as physics, networking and sound, have been re-written to support multi-threading.

Crysis offers an in-game benchmark tool, similar to World in Conflict. This short test places a high amount of stress on a graphics card, since so many landscape features are rendered. For benchmarking purposes, Crysis can mean trouble, as it places high demand on both GPU and CPU resources. Benchmark Reviews uses the Crysis Benchmark Tool by Mad Boris to test frame rates in batches, which allows the results of many tests to be averaged.

Low-resolution testing allows the graphics processor to plateau at its maximum output performance, which shifts demand onto the other system components. At lower resolutions, Crysis will reflect the GPU's top-end speed in the composite score, indicating full-throttle performance with little load. This makes for a less GPU-dependent test environment and is helpful in creating a baseline for measuring maximum output performance in the next few test results. At the 1280x1024 resolution used by some newer 17" and most 19" monitors, all of the video cards tested performed at very respectable levels.

Crysis-1280-no-AA-and-4-AA.jpg

Crysis is a game that can and will bring almost any card to its knees. It appears that Crysis also prefers DX9 over DX10, with the DX9 results losing only once to DX10, and only when no AA was applied. The 9800 GTX+ does shine among the other cards, with an average 40% lead over the 9600GT KO Edition and just under a 20% lead over the 8800GT SC. The results in the graph above would be playable for all cards with 0x anti-aliasing applied, with only the 9800 GTX+ truly surviving 4x AA and remaining playable, barely.

Crysis-1680-no-AA-and-4-AA.jpg

Crysis-1900-no-AA-and-4-AA.jpg

Wow; at 1680 x 1050 and 1920 x 1200 it's painfully clear that none of these cards would survive anti-aliasing at all. Not even the 9800 GTX+ has enough horsepower to run with 4x AA, and honestly it would not really be playable at 1920 x 1200 even with 0x AA; it barely survives 1680 x 1050 at 35 FPS! Crysis is a killer.


Far Cry 2 Benchmark Results

The newest game benchmark tool at Benchmark Reviews: Far Cry 2 is one of the newest first-person shooters out there. The game is playable in both DX9 and DX10.

Far Cry 2 uses a brand-new game engine called the Dunia Engine. Taken from UBI.com: "The Dunia Engine was built specifically for Far Cry 2 by the award-winning Ubisoft Montréal development team. It delivers the most realistic destructible environments, amazing special effects such as dynamic fire propagation and storm effects, a real-time night-and-day cycle, a dynamic music system, non-scripted enemy A.I. and so much more..."

This benchmark tool is very similar in setup and usage to the Mad Boris Crysis benchmark tool. It allows batch runs and reports maximum, minimum, and average frame rates for each individual run, as well as averages for the entire batch. It can also stress your system fairly hard, as it runs through many types of terrain and many different effects. The test used for this review was Ranch Medium, at both 4x AA and 0x AA, with Ultra quality everywhere else, at each resolution.

Based on the test results charted below, it's clear that Far Cry 2 is one game that actually favors DX10, at least when some anti-aliasing is applied. The average frame rate is shown for each resolution in the charts below.

FarCry2-1280-no-AA-and-4-AA.jpg

The Far Cry 2 results show something very interesting. Compared with the other results, especially DMC4, there seems to be a reversal: this game appears to favor DX10, and, opposite to DMC4, it is really evident when 4x AA is applied. The DirectX 10 frame rates are anywhere from 10% to 15% above the DirectX 9 numbers. The 9800 GTX+ puts up playable numbers across the board here, with the 8800GT cutting it a bit close when AA is applied.

FarCry2-1680-no-AA-and-4-AA.jpg

There is not much to say here; the 1680 x 1050 results reflect the previous chart. What I thought odd was the 9800 GTX+ DX10 FPS with no AA applied: there seems to be a sweet spot with this card at this resolution, as it is only one average FPS below the 1280 results. Sure, this could be a fluke, but I've rerun this test several times only to come up with the same results. Far Cry 2 at 1680 x 1050 really likes this PALiT 9800 GTX+.

FarCry2-1920-no-AA-and-4-AA.jpg

The biggest jump in DX10 performance comes at 1920 x 1200 when 4x AA is applied. At this resolution, DirectX 10 offered approximately a 40% increase in FPS over DirectX 9. However, it's too little, too late: even with the boost, none of the cards survived 4x AA, while almost all remain playable with no AA applied. At a 38 FPS average, the PALiT 9800 GTX+ would be playable, but barely; for me, that is cutting it a little too close.

Keep in mind that in these Far Cry 2 tests everything is set to Ultra, the highest settings, whether anti-aliasing is applied or not. In Far Cry 2, as in any of these games, you can expect a smoother experience when some of the eye candy is turned down.

VGA Power Consumption

Life is not as affordable as it used to be, and items such as gasoline, natural gas, and electricity all top the list of resources whose prices have exploded over the past few years. Add to this the limits of non-renewable resources compared to current demands, and you can see that prices are only going to get worse. Planet Earth needs our help, and needs it badly. With forests becoming barren of vegetation and snow-capped poles quickly turning brown, the technology industry has a new attitude toward suddenly becoming "green". I'll spare you the powerful marketing hype that I get from various manufacturers every day and get right to the point: your computer hasn't been doing much to help save energy... at least up until now.

To measure isolated video card power consumption, Benchmark Reviews uses the Kill-A-Watt EZ (model P4400) power meter made by P3 International. A baseline test is taken without a video card installed inside our computer system, which is allowed to boot into Windows and rest idle at the login screen before power consumption is recorded. Once the baseline reading has been taken, the graphics card is installed and the system is again booted into Windows and left idle at the login screen. Our final loaded power consumption reading is taken with the video card running a stress test using FurMark. Below is a chart with the isolated video card power consumption (not system total) displayed in Watts for each specified test product:
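
The isolation method described above boils down to a simple subtraction: the wall reading with the card installed, minus the baseline reading taken with no video card at all. Here is a sketch, with hypothetical wall readings chosen purely for illustration:

```python
def isolated_card_power(baseline_watts, with_card_watts):
    """Kill-A-Watt style isolation: whole-system wall draw with the card
    installed, minus the baseline draw measured with no video card."""
    return with_card_watts - baseline_watts

# Hypothetical readings; only the subtraction method comes from the text.
baseline_idle = 98       # W at the login screen, no video card installed
idle_with_card = 142     # W at the login screen with the card installed
furmark_with_card = 247  # W with FurMark stressing the card

print(isolated_card_power(baseline_idle, idle_with_card), "W idle")
print(isolated_card_power(baseline_idle, furmark_with_card), "W loaded")
```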

Video Card Power Consumption by Benchmark Reviews

VGA Product Description (sorted by combined total power) | Idle Power | Loaded Power
NVIDIA GeForce GTX 480 SLI Set                 |  82 W | 655 W
NVIDIA GeForce GTX 590 Reference Design        |  53 W | 396 W
ATI Radeon HD 4870 X2 Reference Design         | 100 W | 320 W
AMD Radeon HD 6990 Reference Design            |  46 W | 350 W
NVIDIA GeForce GTX 295 Reference Design        |  74 W | 302 W
ASUS GeForce GTX 480 Reference Design          |  39 W | 315 W
ATI Radeon HD 5970 Reference Design            |  48 W | 299 W
NVIDIA GeForce GTX 690 Reference Design        |  25 W | 321 W
ATI Radeon HD 4850 CrossFireX Set              | 123 W | 210 W
ATI Radeon HD 4890 Reference Design            |  65 W | 268 W
AMD Radeon HD 7970 Reference Design            |  21 W | 311 W
NVIDIA GeForce GTX 470 Reference Design        |  42 W | 278 W
NVIDIA GeForce GTX 580 Reference Design        |  31 W | 246 W
NVIDIA GeForce GTX 570 Reference Design        |  31 W | 241 W
ATI Radeon HD 5870 Reference Design            |  25 W | 240 W
ATI Radeon HD 6970 Reference Design            |  24 W | 233 W
NVIDIA GeForce GTX 465 Reference Design        |  36 W | 219 W
NVIDIA GeForce GTX 680 Reference Design        |  14 W | 243 W
Sapphire Radeon HD 4850 X2 11139-00-40R        |  73 W | 180 W
NVIDIA GeForce 9800 GX2 Reference Design       |  85 W | 186 W
NVIDIA GeForce GTX 780 Reference Design        |  10 W | 275 W
NVIDIA GeForce GTX 770 Reference Design        |   9 W | 256 W
NVIDIA GeForce GTX 280 Reference Design        |  35 W | 225 W
NVIDIA GeForce GTX 260 (216) Reference Design  |  42 W | 203 W
ATI Radeon HD 4870 Reference Design            |  58 W | 166 W
NVIDIA GeForce GTX 560 Ti Reference Design     |  17 W | 199 W
NVIDIA GeForce GTX 460 Reference Design        |  18 W | 167 W
AMD Radeon HD 6870 Reference Design            |  20 W | 162 W
NVIDIA GeForce GTX 670 Reference Design        |  14 W | 167 W
ATI Radeon HD 5850 Reference Design            |  24 W | 157 W
NVIDIA GeForce GTX 650 Ti BOOST Reference Design |  8 W | 164 W
AMD Radeon HD 6850 Reference Design            |  20 W | 139 W
NVIDIA GeForce 8800 GT Reference Design        |  31 W | 133 W
ATI Radeon HD 4770 RV740 GDDR5 Reference Design |  37 W | 120 W
ATI Radeon HD 5770 Reference Design            |  16 W | 122 W
NVIDIA GeForce GTS 450 Reference Design        |  22 W | 115 W
NVIDIA GeForce GTX 650 Ti Reference Design     |  12 W | 112 W
ATI Radeon HD 4670 Reference Design            |   9 W |  70 W
* Results are accurate to within +/- 5W.

Card     | VGA Idle | VGA Load
9600GT   | 30 W     | 120 W
8800GT   | 36 W     | 134 W
9800GTX+ | 44 W     | 149 W

Although the 9800GTX+ draws the most power of the three cards tested here, it would still sit near the bottom of the consumption list among cards with comparable or higher performance.

This card does require two 6-pin plugs from the power supply. Keep power requirements in mind when shopping for any new video card; some people will be purchasing a new PSU along with their card.

9800 GTX+ Final Thoughts

When I initially started this article, I had every intention of including an overclocking section, since this card, although slightly overclocked from the factory, performs well at default settings. So I pressed on, and I did get a decent overclock. The best stable overclock I could reach was 810 MHz for the core, 1863 MHz for the shaders, and 1200 MHz for the memory. I ran a set of Far Cry 2 benchmarks and was very sad to see very little improvement. I guess in this day and age it's easy to get your hopes up for free performance. I did not test this overclock in a synthetic benchmark like 3DMark06; it may very well have increased the score, but I was more interested in real-world results. For now, at least, I'm going to call my overclocking results inconclusive. Oh well, back to defaults. I'll update with any results I get from a longer test drive of this card.
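
For what it's worth, that best stable overclock amounts to only modest gains over the shipping clocks, which may help explain the small real-world improvement. The percentages below are plain arithmetic on the figures quoted above:

```python
# Shipping clock vs. best stable overclock, in MHz, from this review
clocks = {
    "core":   (745, 810),
    "shader": (1836, 1863),
    "memory": (1100, 1200),
}

for name, (stock, oc) in clocks.items():
    gain_pct = (oc - stock) / stock * 100
    print(f"{name:6s}: {stock} -> {oc} MHz (+{gain_pct:.1f}%)")
```

The shader domain gains under 2%, which could plausibly be why frame rates barely moved in the Far Cry 2 runs.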

I'm not sure what that says about me, or about the computer industry. Should we as consumers simply expect free performance? Should hardware companies over-engineer their products just to satisfy greedy consumers? Is it their fault I went in looking for something for nothing? No on all counts. This card performs exactly as intended, and I can't fault it for not giving me something for nothing.

9800gtx+rearangle.jpg

PALiT 9800GTX+ Conclusion

The 9800 GTX+ arrived safe and sound in a huge box. It was very securely packaged, and the box art featuring Frobot is one of those things most people will have a love/hate relationship with. Frobot aside, the box art is very much "NVIDIA": PALiT doesn't stray far from home base, playing it safe, which is fine with me. There's no question when you hold the box that it's an NVIDIA product.

The card itself comes as a bit of a shock. You know you're looking at a GeForce card, but everything about it says it came from the other team, which messes with you a little. The heatsink is very sturdy and gives the card an aftermarket look, and heatpipes are always cool to see sticking out the side. This is definitely one unique 9800 GTX+!

The card appears to be built from high-quality materials. The components seem robust and are laid out logically for heat dissipation. The well-made RAM heatspreaders, firmly attached to the card, initially raised more questions than they answered, but throughout my overclocking tests the RAM was never an issue that held the card back.

The PALiT 9800 GTX+ functions just as described. Not once did it have even the slightest hiccup. I tested it in both Windows XP Pro SP3 and Vista Ultimate 64-bit, and it ran smoothly in both operating systems. This would be my card of choice for an HTPC that moonlights as a gaming machine.

At the time of writing, the NE/98TX+XT352 is available at NewEgg for $189.99 ($159.99 after a $30 mail-in rebate). It is priced on par with other 9800 GTX+ cards and within about $10-$15 of its true competition, the HD4850. If you're looking for a mid-level performer that plays most games well at or below 1920x1200, by all means give this card a try.

With all the testing, charts, and writing behind me, I'm left with a solid-performing mid/high-level graphics card and one glaring question: why? Why go to the trouble of designing a whole new PCB, heatsink, and fan when the result is just another 9800 GTX+? It's nearly identical in specs, performance, and price to every other GTX+. Is it the noise? I'm not sure how loud the reference heatsink/fan is; perhaps it's louder, but couldn't PALiT have redesigned just the cooler? Don't get me wrong: I like the card, and it performs flawlessly, but it doesn't stand out from the crowd (other than its looks). It just seems like a lot of work to produce something that simply gets in line with the others.

Pros:

+ Unique looks
+ Very cool running
+ Quiet heat sink fan
+ Shorter than typical 9800 GTX+

Cons:

- Unique looks
- Open card design
- Limited overclocks
- Unnecessary

Ratings:

  • Presentation: 8.5
  • Appearance: 8.5
  • Construction: 9.0
  • Functionality: 8.5
  • Value: 8.0

Final Score: 8.5 out of 10.

Questions? Comments? Benchmark Reviews really wants your feedback. We invite you to leave your remarks in our Discussion Forum.

