Palit GeForce 9800 GTX+ Video Card NE/98TX+XT352
Reviews - Featured Reviews: Video Cards
Written by Tim White - Edited by Olin Coles
Tuesday, 13 January 2009
PALiT NE/98TX+XT352

Benchmark Reviews has had the wonderful opportunity to review and critique some of the best and most powerful video cards currently available anywhere. These cards are exciting: dreams of playing your favorite video game larger than life and silky smooth abound. Wouldn't it be great if we could all afford one of these monster video cards? Today Benchmark Reviews will take a close look at what I'll call a junior monster. Currently fourth in NVIDIA's lineup of single-GPU cards, the GeForce 9800 GTX+ is a card for the mainstream gamer, offering refinements on an already proven design: the wildly popular and powerful G92 core. With the die shrink to 55nm, speeds are up and temperatures are down. We will focus specifically on PALiT's non-reference GeForce 9800 GTX+, model NE/98TX+XT352. With its nearly silent cooler and slightly overclocked core, this card should put up some good numbers.
The 9800 GTX+ was brought about to up the stakes, as it were, in the raging desktop graphics war between NVIDIA and ATI. Specifically, it was aimed directly at the ATI HD4850, and NVIDIA has done a good job hitting its target. The HD48XX series was the first real threat in some time to NVIDIA's crown. This kind of competition is good for us gamers and usually benefits the average mainstream gamer the most. The PALiT NE/98TX+XT352 9800 GTX+ is definitely a player in the mainstream gaming market.
PALiT, never one to sit idly by, has done what they do best: taken a proven design and put their own spin on it. They have produced some of the most unique and well-designed products I've seen. Let's see if this is one of them. About the company: PALiT Multimedia, Inc.
| Card Model | 9800 GTX | 9800 GTX+ | 9800 GTX+ PALiT Design |
|---|---|---|---|
| Core | G92 65nm | G92 55nm | G92 55nm |
| Core Speed | 675 MHz | 738 MHz | 745 MHz |
| Shader Speed | 1688 MHz | 1836 MHz | 1836 MHz |
| Memory Speed | 1100 MHz | 1100 MHz | 1100 MHz |
Closer Look: PALiT 9800 GTX+
The PALiT 9800 GTX+ arrived in a fairly large FedEx box. I assumed it must have a lot of padding inside to protect the actual product box. I was wrong; it was all product box. The box is massive, bigger than most mainboard boxes. This isn't necessarily a bad thing, as it offers plenty of protection, but it seems to use a lot of extra material that isn't needed. In this day and age, when the buzzword is "green", this box seems excessive. Don't get me wrong, it's very well done, with high-quality print clearly stating what's inside and a brief technical description on the rear of the box in 12 different languages. PALiT's love-him-or-hate-him mascot Frobot dominates the front cover.
Here we are with the nitty-gritty. Included in the box are the graphics card, a DVI-to-VGA adapter, a DVI-to-HDMI adapter, and the driver CD. Not pictured but included was a card with a product key for a FREE copy of 3DMark Vantage Advanced Edition. It's not a hard copy of the benchmark tool but a product key, so you have to download your copy from www.futuremark.com and enter the key during installation. I thought that was a nice touch, and it adds a $20 value to the card. Let's be honest: if you are in the market for this type of card, you'll want to know how it stacks up. Since taking these photos I've found, stuck under a cardboard fold inside the box, a Molex-to-six-pin PCI-E power adapter and a two-pin wire for connecting this graphics card to your sound card to provide audio over HDMI via the included adapter (not shown in the photo below).
Out of the box you can see this card is clearly not your typical NVIDIA GeForce 9800GTX+. As a matter of fact, it's almost unrecognizable as a GeForce card and actually resembles an ATI card. The first and most notable difference is the red printed circuit board; next is the massive heatsink/fan/shroud on top. These outside appearances are only the tip of the iceberg, hinting at the differences between this and the reference design. PALiT is well known for making true non-reference designs. This is nothing new for PALiT; they have been very successful with both NVIDIA and ATI non-reference designs over the past few years.
The rear of the PALiT-designed 9800GTX+ only reinforces my impression that this looks like an ATI card. This comes as no real surprise, since PALiT makes both NVIDIA and ATI cards. I've seen many an ATI card with this type of sprung heatsink retention bracket, and if it works, that's all that matters. We here at Benchmarkreviews.com are after performance, after all; there's no room for fanboys here.
The PALiT 9800GTX+ cooler is a dual-slot design, and although the entire card is not covered with a shroud and ductwork to direct the air out the rear, much of the hot air does manage to make its way out the vented second slot cover. I'd say it's about a 30/70 split: 30% of the air moves out the rear of the case, while 70% blows down, out, and toward the front of the case over the memory, power capacitors, and circuitry, where unfortunately it's recycled inside the case. This card has two dual-link DVI ports and one S-Video port.
The top view of the card shows this cooler to be a dual nickel-plated heatpipe design mated with many fine aluminum fins for heat dissipation. After seeing other PALiT designs that completely cover the card with a shroud, I'm wondering why the change to an open-air design now. Also visible are the dual six-pin PCI-E power connectors and, near the front, the tri-SLI connectors. All pretty much standard fare for any 9800GTX+. So far the card appears to be standard in features but anything but standard in the looks department. Let's move on to the details section and get a better look at what's really going on with this card.
NE/98TX+XT352 Detailed Features
The heart of the beast: yes, yet another G92 core. Reference speeds are up from the 9800 GTX (65nm), but other than a small factory overclock the specs are the same as any other 9800 GTX+. GPU-Z reports this core as a 65nm process, but the numbers on the core do indeed check out to indicate a 55nm core. At least in this case the smaller die does equate to lower temperatures and allows higher frequencies than a standard 9800 GTX (non-plus).
The PALiT Design 9800 GTX+ uses the same memory as most other 9800 GTX+ cards: the very fast Samsung K4J52324QE-BJ08. The BJ08 is rated at 0.8ns, or 1200 MHz, so this memory is currently running under spec. That should mean there's room left for some overclocking. Free performance; who doesn't love that? Eight of these chips are arranged four above and four to the right of the GPU.
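As a rough sanity check of the headroom those numbers imply (a sketch only; the clocks are the ones quoted above):

```python
# Samsung K4J52324QE-BJ08: rated for 1200 MHz (0.8 ns parts), but the
# card ships with the memory clocked at 1100 MHz.
rated_mhz = 1200
actual_mhz = 1100

# In-spec overclocking headroom, as a percentage of the shipping clock.
headroom_pct = (rated_mhz - actual_mhz) / actual_mhz * 100
print(f"In-spec memory headroom: ~{headroom_pct:.1f}%")
```

That works out to roughly 9% before the chips even leave their rated specification.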
This picture shows that PALiT has used two memory heatspreaders. They are attached with screws through the back of the card and have thermal pads running their length. They seem to be made of some kind of composite material. As they are not metal, I can't help but think they can't be dissipating much heat. I'm left wondering if they might be doing more harm than good: it's possible the heatspreaders actually block the air forced down by the GPU fan from cooling the chips.
Below is a comparison of the PALiT design 9800 GTX+ and a reference design 9800 GTX+. The most notable difference between the two designs is the memory IC layout. You can clearly see that the PALiT design is much cleaner, if nothing else. I'm no electronics or electrical engineer, but it appears there are fewer components on the PALiT design, and the components that are present seem much more robust. The card is considerably shorter as well.
All of this is cooled by a nickel-plated dual-heatpipe cooler with aluminum fins and a copper core, all mounted to a cast aluminum base and held on with only four screws. These screws do not use the typical lightweight springs of the NVIDIA reference design but rather a spring-steel cross brace that applies many times the pressure against the core. There is also a PWM-controlled fan that blows down and sideways. This all adds up to very cool operation.
In all my testing I've pushed this card pretty hard with FurMark, and it topped out at 71°C. The card idles around 40°C, and during normal gaming, even for hours on end, it usually does not go above 60°C. I have to say I'm impressed with this heatsink. I was a bit skeptical when I saw it had only two heatpipes; I've seen similar heatsinks on NVIDIA 9600GTs, so seeing this on a 9800GTX+ had me a little worried. This must speak to the success of the die shrink to 55nm and the solid design of this heatsink.
Video Card Testing Methodology
At the start of all tests, the previous display adapter driver is uninstalled and trace components are removed using Driver Cleaner Pro. We then restart the computer system to establish our display settings and define the monitor. Once the hardware is prepared, we begin our testing. The synthetic benchmark tests in 3DMark06 utilize shader models 2.0 and 3.0. For our higher-end VGA products we conduct tests at the following resolutions: 1280x1024 (19" standard LCD), 1680x1050 (22-24" widescreen LCD), and 1920x1200 (24-28" widescreen LCD). In some tests we utilized widescreen monitor resolutions, since more users are beginning to adopt these products for their own computing.
Each benchmark test program begins after a system restart, and the very first result for every test will be ignored since it often only caches the test. This process proved extremely important in the World in Conflict and Supreme Commander benchmarks, as the first run served to cache maps allowing subsequent tests to perform much better than the first. Each test is completed five times, with the average results displayed in our article.
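The run-handling described above can be sketched in a few lines (the frame rates here are invented for illustration; the actual tooling differs per benchmark):

```python
def average_runs(fps_results):
    """Drop the first (cache-warming) run and average the rest,
    as described in the methodology above."""
    if len(fps_results) < 2:
        raise ValueError("need at least two runs")
    counted = fps_results[1:]
    return sum(counted) / len(counted)

# The first run is slower because maps/shaders are not yet cached.
runs = [38.2, 45.1, 44.8, 45.3, 44.9, 45.0]
print(f"Average of counted runs: {average_runs(runs):.1f} fps")
```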
Our site polls and statistics indicate that over 90% of our visitors use their PC for playing video games, and practically every one of you is using one of the screen resolutions mentioned above. Since all of the benchmarks we use for testing represent different game-engine technology and graphics rendering processes, I feel this battery of tests will provide a diverse range of results for you to gauge performance on your own computer system. With Vista on the loose and many games coming out for DX10, I ran all tests except 3DMark06 in both Vista Ultimate SP1 64-bit and Windows XP SP3 32-bit.
**For the record, the 8800GT SC tested here is nearly identical to any 9800GT, and benchmark results would be comparable.
Test System
- Motherboard: EVGA nForce 750i SLI FTW 123-YW-E175-A1
- Processor: Intel E8400 Core 2 Duo 3.0 GHz (overclocked to 3.6 GHz)
- Optical Drive 1: Samsung SH-S203N SATA DVD R/W
- Optical Drive 2: Samsung SH-S203N SATA DVD R/W
- PSU: Corsair TX750 SLI Ready 80 Plus Active PFC Power Supply
- Operating System: Windows XP Professional SP-3 and Vista Ultimate x64 SP-1
Benchmark Applications
- 3DMark06 v1.1.0 (8x Anti-Aliasing & 16x Anisotropic Filtering)
- Devil May Cry 4 (8x Anti-Aliasing and Ultra settings)
- Crysis v1.21 Benchmark (0x and 4x Anti-Aliasing, High settings)
- Far Cry 2 Ranch Medium Benchmark (0x and 4x Anti-Aliasing, Ultra settings)
Video Card Test Products
- EVGA 9600GT KO 512-P3-N865-AR (Forceware v180.48)
- PALiT 9800GTX+ XNE/98TX+XT352 (Forceware v180.48)
3DMark06 Benchmark Results
3DMark is a computer benchmark by Futuremark (formerly named Mad Onion) that measures the DirectX 9 gaming performance of graphics cards. 3DMark06 uses advanced real-time 3D game workloads to measure PC performance using a suite of DirectX 9 3D graphics tests, CPU tests, and 3D feature tests.
3DMark06 tests include all new HDR/SM3.0 graphics tests, SM2.0 graphics tests, AI and physics driven single and multiple cores or processor CPU tests and a collection of comprehensive feature tests to reliably measure next generation gaming performance today. Some enthusiasts may note that Benchmark Reviews does not include CPU-bound tests in our benchmark battery, and that only graphic-bound tests are included.
Here at Benchmark Reviews, we believe that synthetic benchmark tools are just as valuable as video games, but only so long as you're comparing apples to apples. Since the same test is applied in the same controlled method with each test run, I believe 3DMark is a very reliable tool for comparing graphic cards against one-another.
More visitors to Benchmark Reviews operate at 1280x1024 resolution than any other, as it represents the native resolution of 19" LCD monitors. Using this resolution as a starting point, maximum settings were applied in 3DMark06, which for these tests include 8x Anti-Aliasing and 16x Anisotropic Filtering. Low-resolution testing allows the graphics processor to plateau at maximum output performance, which shifts demand onto the other system components to keep up. At lower resolutions 3DMark will reflect the GPU's top-end speed in the composite score, indicating full-throttle performance with little load. This makes for a less GPU-dependent test environment, and is helpful in measuring maximum output performance in the test results.
The following results reflect the average frames per second for the combined Shader Model 2.0 and Shader Model 3.0 tests, rather than a generic score.
With 8x Anti-Aliasing and 16x Anisotropic Filtering turned on at 1280 x 1024, the frame rate dropped nearly 50% for the Shader Model 2.0 tests and more than 50% for the Shader Model 3.0 tests. These average frame rates would not be realistically playable in game, but this test does give us solid comparison information. Like a staircase, each card has approximately a 10% lead over the card before it. At 1680 x 1050 the steps get a little shorter, but the same approximate 10% difference remains, and at 1920 x 1200 we see the same result: smaller steps, but a 10% difference.
From our default settings to the top settings, we see a drop in average frame rate of over 66%! Again, this information is good for comparison's sake, but we cannot directly relate it to any specific game or to how these cards may perform in the real world. In the next section we'll see some real game benchmarks that should be fairly representative of how these cards might perform for you.
Devil May Cry 4 Benchmark Results
Devil May Cry 4 was released on PC in 2008 as the fourth installment in the Devil May Cry video game series. DMC4 is a direct port to the PC from the console versions, which operate at the native 720P game resolution with no other platform restrictions. Devil May Cry 4 uses the refined MT Framework game engine, which has been used for many popular Capcom game titles over the past several years.
MT Framework is an exclusive seventh-generation game engine built for games developed for the PlayStation 3 and Xbox 360, and for PC ports. MT stands for "Multi-Thread", "Meta Tools" and "Multi-Target". Capcom originally intended to use an outside engine, but none matched their specific requirements in performance and flexibility. Games using the MT Framework are originally developed on the PC and then ported to the two console platforms.
On the PC version a special bonus called Turbo Mode is featured, giving the game a slightly faster speed, and a new difficulty called Legendary Dark Knight Mode is implemented. The PC version also has both DirectX 9 and DirectX 10 mode for Microsoft Windows XP and Vista Operating Systems.
It's always nice to be able to compare the results we receive here at Benchmark Reviews with the results you get on your own computer system. Usually this isn't possible, since settings and configurations make it nearly impossible to match one system to the next; plus, you have to own the game or benchmark tool we used.
Devil May Cry 4 fixes this by offering a free benchmark tool available for download. Because the DMC4 MT Framework game engine is rather low-demand for today's cutting-edge multi-GPU video cards, Benchmark Reviews tests at the 1920x1200 resolution with 8x AA (the highest AA setting available to Radeon HD video cards) and the highest in-game settings available. The benchmark runs through four test scenes, but scenes #2 and #4 are the ones that usually offer a challenge. Displayed below is our result for the test.
Devil May Cry 4 really posed no problem for any of these cards, whether in DirectX 9 or DirectX 10, so I'll only show the results from the highest resolution tested, 1920 x 1200. Although Devil May Cry 4 can be played in DX9 or DX10, it's very clear that this game prefers DX9. What is amazing is that while testing in DX10, the results of one card would almost exactly equal the next lesser card playing the same test in DX9. All this with very few differences noticeable to the naked eye.
Crysis Benchmark Results
Crysis uses a new graphics engine: the CryENGINE2, which is the successor to Far Cry's CryENGINE. CryENGINE2 is among the first engines to use the Direct3D 10 (DirectX10) framework of Windows Vista, but can also run using DirectX9, both on Vista and Windows XP.
Roy Taylor, Vice President of Content Relations at NVIDIA, has spoken on the subject of the engine's complexity, stating that Crysis has over a million lines of code, 1GB of texture data, and 85,000 shaders. To get the most out of modern multicore processor architectures, CPU intensive subsystems of CryENGINE 2 such as physics, networking and sound, have been re-written to support multi-threading.
Crysis offers an in-game benchmark tool, which is similar to World in Conflict. This short test does place some high amounts of stress on a graphics card, since there are so many landscape features rendered. For benchmarking purposes, Crysis can mean trouble as it places a high demand on both GPU and CPU resources. Benchmark Reviews uses the Crysis Benchmark Tool by Mad Boris to test frame rates in batches, which allows the results of many tests to be averaged.
Low-resolution testing allows the graphics processor to plateau at its maximum output performance, which shifts demand onto the other system components. At lower resolutions Crysis will reflect the GPU's top-end speed in the composite score, indicating full-throttle performance with little load. This makes for a less GPU-dependent test environment, and is helpful in creating a baseline for measuring maximum output performance in the next few test results. At the 1280x1024 resolution used by some newer 17" and most 19" monitors, all of the video cards tested performed at very respectable levels.
Crysis is a game that can and will bring almost any card to its knees. It appears that Crysis also prefers DX9 over DX10, with the DX9 results losing only once to DX10, and that only when no AA was applied. The 9800 GTX+ does shine amongst the other cards, with an average 40% lead over the 9600GT KO edition and slightly less than a 20% lead over the 8800GT SC. The results in the above graph would be playable for all cards if 0x anti-aliasing is applied, with only the 9800 GTX+ truly surviving 4x AA and remaining playable, barely.
Wow, at 1680 x 1050 and 1920 x 1200 it's painfully clear that none of these cards would survive anti-aliasing at all. Not even the 9800 GTX+ has enough horsepower to run with 4x AA, and honestly it would not really be playable at 1920 x 1200 even with 0x AA; it barely survives 1680 x 1050 at 35 fps! Crysis is a killer.
Far Cry 2 Benchmark Results
The newest game benchmark tool at Benchmark Reviews: Far Cry 2 is one of the newest first-person shooter games out there. The game is playable in both DX9 and DX10.
Far Cry 2 uses a brand new game engine called the Dunia Engine. Taken from UBI.com: "The Dunia Engine was built specifically for Far Cry 2 by the award-winning Ubisoft Montréal development team. It delivers the most realistic destructible environments, amazing special effects such as dynamic fire propagation and storm effects, real-time night-and-day cycle, dynamic music system, non-scripted enemy A.I. and so much more..."
This benchmark tool is very similar in setup and usage to the Mad Boris Crysis benchmark tool. It allows batch runs and reports maximum, minimum, and average frame rates for each individual run, as well as averages for the entire batch. This benchmark tool can also stress your system fairly hard, as it runs through many types of terrain and many different effects. The test used for this review was Ranch Medium, at both 0x and 4x AA with Ultra quality everywhere else, for each resolution.
Based on the test results charted below it's clear that Far Cry 2 is one game that actually favors DX10 at least when some anti-aliasing is applied. The average frame rate is shown for each resolution in the charts below.
Far Cry 2 results show something very interesting. Compared with the other results especially DMC4 there seems to be a reversal. Finally this game appears to favor DX10 and again, opposite of DMC4 it is really evident when 4x AA is applied. The DirectX 10 frames per second are anywhere from 10% to 15% above the DirectX 9 frames per second. The 9800 GTX+ is putting up playable numbers across the board here with the 8800GT cutting it a bit close when AA is applied.
Not much to say here; the 1680 x 1050 results reflect the previous chart. What I thought odd was the 9800 GTX+ DX10 FPS when no AA was applied: there seems to be a sweet spot with this card at this resolution. It's one average FPS less than the 1280 resolution results. Sure, this could be an odd occurrence, but I've rerun this test several times only to come up with the same results. Far Cry 2 at 1680 x 1050 really likes this PALiT 9800 GTX+.
The biggest jump in DX10 performance comes at the 1920 x 1200 resolution when 4x AA is applied. At this resolution DirectX 10 offered an approximately 40% increase in FPS over DirectX 9. However, it's too little too late. Even with the boost, none of the cards survived 4x AA, while almost all remain playable when no AA is applied. At an average of 38 frames per second the PALiT 9800 GTX+ would be playable, but barely; for me that is cutting it a little too close.
Keep in mind with these Far Cry 2 tests, everything is set to Ultra or the highest settings regardless of anti aliasing being applied or not. In Far Cry 2 as in any of these games you should easily expect a smoother experience when some of the eye candy is turned down.
VGA Power Consumption
Life is not as affordable as it used to be, and items such as gasoline, natural gas, and electricity all top the list of resources which have exploded in price over the past few years. Add to this the limit of non-renewable resources compared to current demands, and you can see that the prices are only going to get worse. Planet Earth needs our help, and needs it badly. With forests becoming barren of vegetation and snow-capped poles quickly turning brown, the technology industry has a new attitude towards suddenly becoming "green". I'll spare you the powerful marketing hype that I get from various manufacturers every day, and get right to the point: your computer hasn't been doing much to help save energy... at least up until now.
To measure isolated video card power consumption, Benchmark Reviews uses the Kill-A-Watt EZ (model P4400) power meter made by P3 International. A baseline test is taken without a video card installed inside our computer system, which is allowed to boot into Windows and rest idle at the login screen before power consumption is recorded. Once the baseline reading has been taken, the graphics card is installed and the system is again booted into Windows and left idle at the login screen. Our final loaded power consumption reading is taken with the video card running a stress test using FurMark. Below is a chart with the isolated video card power consumption (not system total) displayed in Watts for each specified test product:
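The baseline-subtraction method described above amounts to simple arithmetic. Here is a sketch with hypothetical meter readings; only the method, not the numbers, comes from the text:

```python
def isolated_card_power(system_watts, baseline_watts):
    """Card-only draw: whole-system reading minus the reading taken
    with no video card installed."""
    return system_watts - baseline_watts

# Hypothetical Kill-A-Watt readings, in watts:
baseline = 98          # system idle at login screen, no video card
idle_with_card = 142   # same state, card installed
load_with_card = 247   # FurMark stress test running

print(f"Idle:   {isolated_card_power(idle_with_card, baseline)} W")
print(f"Loaded: {isolated_card_power(load_with_card, baseline)} W")
```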
| VGA Product Description (sorted by combined total power) | Idle Power | Loaded Power |
|---|---|---|
| NVIDIA GeForce GTX 480 SLI Set | 82 W | 655 W |
| NVIDIA GeForce GTX 590 Reference Design | 53 W | 396 W |
| ATI Radeon HD 4870 X2 Reference Design | 100 W | 320 W |
| AMD Radeon HD 6990 Reference Design | 46 W | 350 W |
| NVIDIA GeForce GTX 295 Reference Design | 74 W | 302 W |
| ASUS GeForce GTX 480 Reference Design | 39 W | 315 W |
| ATI Radeon HD 5970 Reference Design | 48 W | 299 W |
| NVIDIA GeForce GTX 690 Reference Design | 25 W | 321 W |
| ATI Radeon HD 4850 CrossFireX Set | 123 W | 210 W |
| ATI Radeon HD 4890 Reference Design | 65 W | 268 W |
| AMD Radeon HD 7970 Reference Design | 21 W | 311 W |
| NVIDIA GeForce GTX 470 Reference Design | 42 W | 278 W |
| NVIDIA GeForce GTX 580 Reference Design | 31 W | 246 W |
| NVIDIA GeForce GTX 570 Reference Design | 31 W | 241 W |
| ATI Radeon HD 5870 Reference Design | 25 W | 240 W |
| ATI Radeon HD 6970 Reference Design | 24 W | 233 W |
| NVIDIA GeForce GTX 465 Reference Design | 36 W | 219 W |
| NVIDIA GeForce GTX 680 Reference Design | 14 W | 243 W |
| Sapphire Radeon HD 4850 X2 11139-00-40R | 73 W | 180 W |
| NVIDIA GeForce 9800 GX2 Reference Design | 85 W | 186 W |
| NVIDIA GeForce GTX 780 Reference Design | 10 W | 275 W |
| NVIDIA GeForce GTX 770 Reference Design | 9 W | 256 W |
| NVIDIA GeForce GTX 280 Reference Design | 35 W | 225 W |
| NVIDIA GeForce GTX 260 (216) Reference Design | 42 W | 203 W |
| ATI Radeon HD 4870 Reference Design | 58 W | 166 W |
| NVIDIA GeForce GTX 560 Ti Reference Design | 17 W | 199 W |
| NVIDIA GeForce GTX 460 Reference Design | 18 W | 167 W |
| AMD Radeon HD 6870 Reference Design | 20 W | 162 W |
| NVIDIA GeForce GTX 670 Reference Design | 14 W | 167 W |
| ATI Radeon HD 5850 Reference Design | 24 W | 157 W |
| NVIDIA GeForce GTX 650 Ti BOOST Reference Design | 8 W | 164 W |
| AMD Radeon HD 6850 Reference Design | 20 W | 139 W |
| NVIDIA GeForce 8800 GT Reference Design | 31 W | 133 W |
| ATI Radeon HD 4770 RV740 GDDR5 Reference Design | 37 W | 120 W |
| ATI Radeon HD 5770 Reference Design | 16 W | 122 W |
| NVIDIA GeForce GTS 450 Reference Design | 22 W | 115 W |
| NVIDIA GeForce GTX 650 Ti Reference Design | 12 W | 112 W |
| ATI Radeon HD 4670 Reference Design | 9 W | 70 W |
| Card | VGA Idle | VGA Load |
|---|---|---|
| 9600GT | 30 W | 120 W |
| 8800GT | 36 W | 134 W |
| 9800GTX+ | 44 W | 149 W |
Although the 9800GTX+ draws the most power of the three cards charted here, it would still sit near the bottom of the larger list among cards with comparable or higher performance.
This card does require the use of two 6 pin plugs from the power supply. Keep power requirements in mind when shopping for any new video card. Some people will be purchasing a new PSU along with their cards.
9800 GTX+ Final Thoughts
When I initially started this article I had every intention of including an overclocking section. However, this card is already slightly overclocked from the factory, and it performs well at default settings. Even so, I pressed on and did get a decent overclock. The best stable overclock I could manage was 810 MHz for the core, 1863 MHz for the shaders, and 1200 MHz for the memory. I ran a set of Far Cry 2 benchmarks and was very sad to see very little improvement. I guess in this day and age it's easy to get your hopes up for free performance. I did not test this overclock in a synthetic benchmark like 3DMark06; it may very well have increased the score, but I was more interested in real-world results. For now at least, I'm going to call my overclocking results inconclusive. Oh well, back to defaults. I'll update with any results I get from a longer test drive of this card.
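For perspective, the best stable overclock works out to fairly modest percentage gains over this card's factory clocks (745/1836/1100 MHz, per the spec table earlier); a quick sketch:

```python
# Factory clocks for the PALiT 9800 GTX+ vs. the best stable
# overclock reported in the text above, in MHz.
factory = {"core": 745, "shader": 1836, "memory": 1100}
best    = {"core": 810, "shader": 1863, "memory": 1200}

for domain in factory:
    gain_pct = (best[domain] - factory[domain]) / factory[domain] * 100
    print(f"{domain:>6}: +{gain_pct:.1f}%")
```

The shader clock, which often matters most for game frame rates, only moved about 1.5%, which may help explain why the Far Cry 2 results barely improved.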
I'm not sure what all that says about me, or about the computer industry for that matter. Should we as consumers just expect free performance? Should computer hardware companies over-engineer their products just to satisfy greedy consumers? Is it their fault I went into this looking for something for nothing? I don't think so, on all counts. This card performs exactly as intended, and I can't fault it for not giving me something for nothing.
PALiT 9800GTX+ Conclusion
The 9800 GTX+ arrived safe and sound in a huge box. It was very safely packaged and the box art with Frobot is one of those things that most people will have a love/hate relationship with. Other than Frobot the box art is very much "nvidia". They don't stray that far from home base, so they kind of play it safe, which is ok with me. There is no question when you hold that box it's an nvidia product.
The card itself adds a bit of a system shock. When you know you are looking at a Geforce card but everything about it says it's from the other team it messes with you a little bit. The heatsink is very sturdy and adds a bit of aftermarket look to the card and heatpipes are always cool to see sticking out the side. This is definitely one unique 9800 GTX+!
The card seems built from very good materials. The components seem robust and appear to be laid out logically for heat dissipation. The RAM heatspreaders, which are well made and firmly attached to the card, initially only added more questions, but throughout my overclocking tests the RAM was never an issue that held the card back.
The PALiT Design 9800 GTX+ functions just as it is described. Not once did it have even the slightest hiccup or issue. I tested in both Windows XP Pro SP3 and Vista Ultimate 64 bit and it worked smoothly without issue in both operating systems. This card would be my card of choice to put into an HTPC that moonlighted as a gaming machine.
At the time of writing the NE/98TX+XT352 is available at NewEgg for $189.99 ($159.99 after a $30 mail-in rebate). Value is on par price-wise with other 9800 GTX+ cards and within about $10-$15 of its true competition, the HD4850. If you are looking for a mid-level performer that will play most games great at or below 1920 x 1200 resolution, then by all means give this card a try.
With all the testing and charts and writing behind me, I'm left with a solid-performing mid/high-level graphics card and one glaring question still in my mind... Why? Why go to the trouble of designing a whole new PCB, heatsink/fan, etc., when in the end it's just another 9800 GTX+? It's nearly identical in specs and performance to any other GTX+, and it's the same price. Is it the noise? I'm not sure how loud the reference design heatsink/fan is; maybe it's louder? Couldn't they have redesigned just the cooler? Don't get me wrong, I like the card; it performs flawlessly just as it should, but it doesn't stand out from the crowd (other than looks). It just seems to be a lot of work to produce something like this only to have it get in line with the others.
Pros:
+ Unique looks
+ Very cool running
+ Quiet heat sink fan
+ Shorter than typical 9800 GTX+
Cons:
- Unique looks
- Open card design
- Limited overclocks
- Unnecessary
Ratings:
- Presentation: 8.5
- Appearance: 8.5
- Construction: 9.0
- Functionality: 8.5
- Value: 8.0
Final Score: 8.5 out of 10.
Questions? Comments? Benchmark Reviews really wants your feedback. We invite you to leave your remarks in our Discussion Forum.
Related Articles:
- ASUS EAH4870 DK TOP 512MB Video Card
- NVIDIA GeForce GTX 560 Ti GF114 Video Card
- ZOTAC GeForce 9800 GTX+ Zone Edition Video Card
- Sapphire 100315L Radeon HD 6850 Video Card
- PNY GeForce GTX 460 OC XLR8 Video Card
- ZOTAC GeForce 8800 GT 512MB AMP! HDMI Video Card
- PowerColor Go! Green Radeon HD5670
- ASUS GeForce GTX-465 Video Card
- HIS Radeon HD 7870 IceQ Turbo 2GB
- Radeon HD 5770 CrossFireX Performance Scaling