
Thread: X2900XT Review

  1. #21

    Quote Originally Posted by SKYMTL:
    And running CoH uses more power than pretty much anything I've tried...period.

    Even running ATItool and Orthos together takes less power than CoH.
    Correct.

  2. #22

    Here's my take on it-

    Hello all,

    I, like most of you, have been eagerly anticipating the arrival of the latest and greatest ATI offering, the HD 2900XT. Now that we have it in stock at the local e-tailers, a flood of reviews has come out. Everything from performance at insanely high resolutions to the weight of the cooler has been discussed, but what's the verdict? With over 20 reviews out and about right now, it's hard for someone to make a snap decision. Well, here's my take on it, from the perspective of someone looking for the best graphics solution at the $400 price point.

    First off, let me say that I'm very picky about the reviews I accept. I like to see the test system used, detailed explanations of the settings, and graphs that are easy to read. My first and last resource for some of the best reviews on the internet is [H]Enthusiast. Why? They have all of the above in an easy-to-read format, and enough definitive tests to convince an agnostic.
    (Here they are- http://enthusiast.hardocp.com/articl...50aHVzaWFzdA==)

    So is the latest and greatest ATI offering worth the money spent? To be brief, I'd say no. Now before you get your panties in a wad, let me explain.

    Price Point-
    At upwards of 400 US dollars, the HD 2900XT doesn't come cheap. It sits nestled between the 8800GTS 640mb and the 8800GTX, both of which I consider superior solutions for the money. The 8800GTS 640mb can be had for as low as $330 on Newegg after rebate, and the GTX is closing in on the $500 price point as well. Sitting between the two means the HD 2900XT needs to offer a considerable performance boost over the 8800GTS 640mb, and that is not something it has been able to do.

    Performance-
    Six months of delays have netted ATI a card that currently can't compete with its direct price competitor (considering MSRP only, not actual retail), the 8800GTS 640mb. Anywhere from 10% to 30% behind the Nvidia solution is where the HD 2900XT consistently sits, and that's just not enough for $400. At $300, this would be an excellent choice.
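
    To put that price/performance argument in rough numbers: using the figures above ($330 after rebate for the 8800GTS 640mb vs. $400 for the HD 2900XT, with the XT trailing by 10-30%), a quick back-of-the-envelope sketch looks like this. The 60 fps baseline is an arbitrary placeholder; only the ratios matter.
    Code:
    # Rough price/performance sketch using the figures quoted above.
    # The 60 fps baseline for the 8800GTS 640mb is a placeholder -- only the
    # ratios matter.

    def perf_per_dollar(fps, price):
        """Frames per second delivered per dollar spent."""
        return fps / price

    baseline_fps = 60.0                  # placeholder frame rate for the GTS
    gts_price, xt_price = 330.0, 400.0   # street prices quoted in the post

    for gap in (0.10, 0.20, 0.30):       # HD 2900XT trailing by 10-30%
        xt_fps = baseline_fps * (1.0 - gap)
        gts_value = perf_per_dollar(baseline_fps, gts_price)
        xt_value = perf_per_dollar(xt_fps, xt_price)
        print(f"{int(gap * 100)}% slower: GTS {gts_value:.3f} fps/$, "
              f"XT {xt_value:.3f} fps/$ "
              f"({xt_value / gts_value:.0%} of the GTS's value)")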

    Power/Heat-
    Even on an 80nm manufacturing process, the HD 2900XT expels more heat than its direct competitor, and draws nearly 50W more than the GTX. This is frankly unacceptable, especially considering that most GPUs sit right under a very heat-sensitive component, the CPU (and in some cases the NB as well).
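
    For a sense of scale: every watt a card draws ends up as heat inside the case, so that gap translates directly into extra heat the case fans have to deal with. A minimal sketch, assuming a 3-hour gaming session (the session length is an assumption, not a figure from the reviews):
    Code:
    # Every extra watt of draw is an extra joule per second of heat dumped into
    # the case, right under the CPU. The 3-hour session length is an assumption.

    extra_draw_w = 50         # extra draw vs. the 8800GTX, per the reviews cited
    session_hours = 3         # assumed gaming session length

    extra_energy_kwh = extra_draw_w * session_hours / 1000
    extra_heat_kj = extra_draw_w * session_hours * 3600 / 1000

    print(f"Extra heat per {session_hours}h session: {extra_heat_kj:.0f} kJ "
          f"(about {extra_energy_kwh:.2f} kWh) for the cooling to move out")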


    So what do we have? A component that you pay more for, and in return get less performance, higher power draw, and higher heat output. I can understand a life-long ATI enthusiast toughing it out with this card, but for anyone else to make the purchase right now is absurd. At the very least, wait for DX10 to see if the HD 2900XT is a complete failure or not.
    Quote:
    The results clearly show this huh? In what, artificial benchmarks like 3dmark?

    Check the review I linked. I personally consider it one of the most thorough and complete review methodologies on the face of the planet for graphics card testing. It's not as if their results come from way out in left field, either. Many reviews show the HD 2900XT playing second fiddle to the 8800GTS 640mb. Considering that the 8800GTS 640mb comes in at about $50-70 cheaper, requires less power, and puts off less heat, I'm inclined to say that Nvidia has the winner right now.

    Like I said, DX10 may yield different results. Even if it does, I wouldn't consider "hoping for the best" the right thing to do with $400 of hard-earned money. Hell, it could consistently outperform the GTS 640mb by 10% with better drivers, and I would still say that the performance increase doesn't justify an extra $50-70, more heat, and more power.

  3. #23

    The HardOCP review is oddly skewed, considering the current crop of ATI drivers seems to be having issues with the AA implementation. So using their example makes your argument moot at best.

    Personally, I hold no love for HardOCP's testing methodology with graphics cards, but they are great with everything else. In addition, there seem to be some VERY odd discrepancies when it comes to power consumption.

  4. #24

    If all other reviews were indicating that the HD 2900XT was better than the 8800GTS 640mb, I would have been suspicious of this review. Unfortunately that isn't the case.

  5. #25

    Quote Originally Posted by SKYMTL:
    And running CoH uses more power than pretty much anything I've tried...period.

    Even running ATItool and Orthos together takes less power than CoH.
    In my system, I get the highest power consumption when I run Oblivion (and yes, I have CoH too, and I did test it using an actual game replay rather than the built-in benchmark). But then again, I only have a lowly 7900GTO and I use a tweaked oblivion.ini file (to enable multithreading for certain things as well as for general performance improvements; all of them are documented at www.tweakguides.com), so my results can't really be compared with other systems, which have different cards and more likely use an unmodified version of Oblivion.
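
    For anyone curious, the kind of oblivion.ini edit being described boils down to flipping a few flags, roughly as sketched below. The two key names shown (bUseThreadedAI, iNumHavokThreads) and the file path are assumptions based on the TweakGuides documentation; check the guide and back up your own ini before changing anything.
    Code:
    # Minimal sketch of the kind of Oblivion.ini edit described above. The key
    # names and the path are assumptions based on the TweakGuides documentation;
    # verify them against the guide before running this.

    from pathlib import Path

    INI_PATH = Path.home() / "Documents" / "My Games" / "Oblivion" / "Oblivion.ini"
    TWEAKS = {
        "bUseThreadedAI": "1",    # run AI processing on its own thread
        "iNumHavokThreads": "3",  # let the Havok physics engine use more threads
    }

    # Keep an untouched copy before editing.
    backup = INI_PATH.with_name(INI_PATH.name + ".bak")
    backup.write_text(INI_PATH.read_text())

    lines = []
    for line in INI_PATH.read_text().splitlines():
        key = line.split("=", 1)[0].strip()
        if key in TWEAKS:
            line = f"{key}={TWEAKS[key]}"   # rewrite the setting in place
        lines.append(line)

    INI_PATH.write_text("\n".join(lines) + "\n")
    print(f"Updated {INI_PATH} (backup at {backup})")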

  6. #26

    I think it is only appropriate that folks wait for a mature driver-set to roll out before calling the card a flop.

  7. #27

    Quote Originally Posted by Super Nade:
    I think it is only appropriate that folks wait for a mature driver-set to roll out before calling the card a flop.
    I think manufacturers should ship products with drivers that work decently.

  8. #28

    Quote Originally Posted by Super Nade:
    I think it is only appropriate that folks wait for a mature driver-set to roll out before calling the card a flop.
    A universal (and absolutely redundant) excuse that can theoretically apply to ANY GPU manufacturer (both ATi and Nvidia) at any given time.

    Edit: Here's a nice example of what I'm talking about, taken from today's Guru3D review:
    "So we started testing an 8800 GTX with this software [Call of Juarez demo], and yes ... stumbled into bad performance and weird performance issues.....Later that evening NVIDIA noticed that the press obtained the demo and one day later issued a driver which was showing excellent performance and no weird stutters or anything anymore. The driver is the 158.42 driver which be will be released to the public this week."
    Last edited by alexk; 05-14-2007 at 07:14 PM.

  9. #29

    Quote Originally Posted by Spectre:
    I think manufacturers should ship products that have drivers that work decently
    Ideally, yes. But this happens everywhere (sadly): motherboards with crap BIOSes, etc. AMD handled this badly, though. Crap drivers + a delayed product is not a good combination.

    From a technical standpoint, I'm just curious to see if there is any merit to AMD's architecture.

  10. #30

    Disappointing, disappointing, disappointing.

    You may recall I posted a few months ago saying I was considering the possibility of putting together an R600 crossfire system, subject to favourable benchmarks on release (and I received a tirade of vitriolic abuse as a consequence). Now that the benchmarks have come out... well, I am no longer considering this possibility.

    (sigh)

    What's particularly disheartening is that all of the areas where R600 should be particularly strong have actually turned out to be its most serious weaknesses.

    It ought to be at its best in shader-heavy games, but it isn't.

    The massive memory bandwidth it gets from the 512-bit bus ought to make it excel at very high AA and AF settings, leading to excellent image quality (traditionally ATI's strength); but the reality is that R600 suffers more of a performance hit with increasing AA than G80 does (possibly because the MSAA hardware is actually broken and it has to use the shaders to do AA instead of the ROPs).
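
    For readers wondering what "using the shaders to do AA instead of the ROPs" actually means: an MSAA resolve is just an average of the stored subsamples for each pixel. The toy NumPy sketch below shows that box-filter resolve in plain code; it is only an illustration of the operation, not a model of how R600's hardware actually schedules it.
    Code:
    # Toy illustration of an MSAA "resolve": each pixel stores N colour
    # subsamples, and the resolve step averages them into the displayed colour.
    # Dedicated ROP hardware does this alongside other work; doing it in the
    # shader core instead (as speculated for R600) spends shader throughput
    # that could have gone to the game itself. Purely illustrative.

    import numpy as np

    height, width, samples = 4, 4, 8       # tiny 4x4 framebuffer at 8x MSAA
    rng = np.random.default_rng(0)

    # Subsample colour buffer: (height, width, samples, RGB)
    subsamples = rng.random((height, width, samples, 3), dtype=np.float32)

    def box_resolve(buf):
        """Average each pixel's subsamples (a plain box-filter resolve)."""
        return buf.mean(axis=2)

    resolved = box_resolve(subsamples)
    print("resolved frame shape:", resolved.shape)   # (4, 4, 3)
    print("reads + adds per pixel grow linearly with the sample count:",
          samples, "at 8x MSAA vs. 1 with no AA")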

    It also takes a much larger hit with high AF settings than G80 does (possibly because of a woeful lack of texturing hardware).

    Its AF is also worse than G80's in terms of image quality.

    The new AA modes are superficially interesting but (judging by preliminary reviews) cause too much blurring of the overall image to use in practice.

    The HDMI audio and hardware MPEG4 and VC-1 decoding ought to make it ideal for an HTPC setup, but (according to the reviews) the noise level its cooling system generates is so high that it's useless for anything AV-related.

    I've been waiting since January for ATI to release the bloody thing, because my new build really must combine high-end gaming performance with hardware decoding of high-definition MPEG4; but, even now that R600 is actually out, there is still no suitable product on the market!



    I suppose I'm just going to have to buckle down and wait another three or four months until R650 appears and hope that it's the product R600 should have been.
