
Thread: X2900XT Review

  1. #11
    Join Date: Apr 2007
    Posts: 105

    Quote Originally Posted by SKYMTL
    lots of stuff which can be summarized in a single sentence
    So in other words, you don't have an exact power consumption figure for a single 8800GTX card? Just like I thought.

    On a more serious note, all these "full system power consumption" benchmarks are pretty useless (in my opinion) when it comes to determining the power consumption of the video cards themselves. First of all, I don't really know HOW these total power loads were measured: did they use the min/max function on a multimeter (if it even has one), or did they just glance at its LCD once in a while and record the highest wattage they happened to see? None of the reviews you've linked say, and I wasn't physically present, so why should I assume they did it the "right way"?

    Second of all, ALL of the games/benchmarks they used for testing might simply NOT provide maximum load for these cards; after all, they are OLD, DX9-based games/demos which don't use any of the DX10-specific hardware functions present on the G80/R600 GPUs.

    Last of all, I'd rather trust ATi than some random review site, since ATi themselves have ALL the proper equipment (and software, possibly including custom-made DX10 benchmarks) to test the absolute maximum power consumption of each reference card (or maybe even each GPU) directly, without making rough approximations from some specific system's total power consumption.
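
    To make the min/max point concrete, here's a toy sketch (all numbers invented, and Python is used purely for illustration) of how occasional glances at a meter's LCD can understate the real peak:

    Code:
    # Toy illustration of the sampling concern above (hypothetical samples).
    # A meter checked only once in a while can miss short spikes that a
    # true min/max hold function would capture.
    readings = [212, 215, 248, 219, 214, 298, 221, 216]  # one sample per second
    glances = readings[::3]  # what you'd see glancing at the LCD now and then
    print(max(readings))  # 298 W: what a min/max hold would report
    print(max(glances))   # 221 W: the occasional glances miss both spikes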

  2. #12
    Join Date: Oct 2006
    Location: In yer 'fridge, eatin' yer pizza
    Posts: 1,785

    Most sites use a Power Angel (or Kill-A-Watt) and read the wattage display while running a benchmark. After that they multiply the AC wattage drawn at the wall by the efficiency of their particular PSU to arrive at the DC wattage the system actually used. Simple, no?
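
    As a minimal sketch of that conversion (the wattage and efficiency figures are hypothetical):

    Code:
    # The wall meter reads AC input; the PSU's efficiency tells you what
    # fraction of that actually reaches the components as DC.
    ac_watts = 320.0       # hypothetical reading from the wall meter under load
    psu_efficiency = 0.82  # assumed PSU efficiency at that load level
    dc_watts = ac_watts * psu_efficiency
    print(round(dc_watts))  # 262 W of DC power delivered to the system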

  3. #13
    Join Date: Oct 2006
    Location: Montreal, Canada
    Posts: 985

    Or they take a card whose wattage requirement they KNOW and measure the full system load with that card installed. Then they put the new card in the same system, take the reading again, and the difference between the two readings shows how much more or less the new card uses.
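
    A quick sketch of that subtraction method (all figures hypothetical):

    Code:
    # Swap a card of known draw for the new card in an otherwise identical
    # system; the change in total draw shifts the known figure.
    known_card_watts = 85.0    # hypothetical established draw of the old card
    system_with_known = 262.0  # full-system draw measured with the old card
    system_with_new = 338.0    # full-system draw measured with the new card
    new_card_watts = known_card_watts + (system_with_new - system_with_known)
    print(round(new_card_watts))  # 161 W estimated for the new card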

  4. #14
    Join Date: Oct 2006
    Location: Montreal, Canada
    Posts: 985

    Quote Originally Posted by alexk
    So in other words, you don't have an exact power consumption figure for a single 8800GTX card? Just like I thought.
    We do have it:

    http://www.xbitlabs.com/articles/vid...roundup_6.html

    Xbitlabs does VERY thorough tests when it comes to power consumption.

  5. #15
    Join Date: Apr 2007
    Posts: 105

    Quote Originally Posted by SKYMTL
    We do have it:

    http://www.xbitlabs.com/articles/vid...roundup_6.html

    Xbitlabs does VERY thorough tests when it comes to power consumption.
    I can see that they did put some effort into measuring it objectively (such as actually measuring the power delivered through the PCI-E slot by adding shunts to the slot's power lines and connecting them to a multimeter). Unfortunately, their results still don't match any of the various figures I've found on the internet, which claim anywhere from 145 W (some Beyond3D article) to 177 W (a Wikipedia table) and all the way up to 200 W (a PC World India article). Plus, they only tested with the 3DMark suites, which, as I said before, are highly unlikely to provide a full load for such a GPU (no DX10 features), so I can't really consider these valid results.
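
    For reference, the shunt technique works roughly like this (the resistance and voltage values here are hypothetical):

    Code:
    # A low-resistance shunt sits in series with each supply line; the voltage
    # drop across it gives the current via Ohm's law, and the line's power is
    # the rail voltage times that current.
    shunt_ohms = 0.01         # hypothetical series shunt resistance
    shunt_drop_volts = 0.055  # drop measured across the shunt with a multimeter
    rail_volts = 12.0         # nominal rail voltage (e.g. the PCI-E 12 V line)
    current_amps = shunt_drop_volts / shunt_ohms   # 5.5 A through this line
    print(rail_volts * current_amps)  # 66.0 W on this line; sum every line
    # (including the slot's) to get the card's total draw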

  6. #16
    Join Date: Apr 2007
    Posts: 105

    BTW, for whoever was interested in the results of the STALKER benchmarks:
    http://enthusiast.hardocp.com/articl...VudGh1c2lhc3Q=
    "Min/Max/Avg framerates for the ATI Radeon HD 2900 XT were 4/40/21.3. The GeForce 8800 GTS 640 MB had 24/82/38.9 framerates in this game"

    Like I said before, STALKER is a very Nvidia-optimized game, so these results aren't really surprising, and I highly doubt they will change with any future driver releases.
    Last edited by alexk; 05-14-2007 at 12:05 AM.

  7. #17
    Join Date: Feb 2007
    Location: The Netherlands
    Posts: 53

    Here's the giant review (26 pages!) of the HD 2900XT by Guru3D:
    http://www.guru3d.com/article/Videocards/431/1/
    (Dynamic lighting is still disabled in the S.T.A.L.K.E.R.: Shadow of Chernobyl test.)

    Power consumption of the HD 2900XT (vs. GeForce 8800):
    http://www.guru3d.com/article/Videocards/431/13/

  8. #18
    Join Date: Oct 2006
    Posts: 4,126

    Quote Originally Posted by alexk
    BTW, for whoever was interested in the results of the STALKER benchmarks:
    http://enthusiast.hardocp.com/articl...VudGh1c2lhc3Q=
    "Min/Max/Avg framerates for the ATI Radeon HD 2900 XT were 4/40/21.3. The GeForce 8800 GTS 640 MB had 24/82/38.9 framerates in this game"
    Seems Crossfire didn't work so well at 55c...

  9. #19
    Join Date: Oct 2006
    Location: jonnyGURU forums, of course!
    Posts: 16,059

    Quote Originally Posted by SKYMTL
    We do have it:

    http://www.xbitlabs.com/articles/vid...roundup_6.html

    Xbitlabs does VERY thorough tests when it comes to power consumption.
    They do. But their figures need to be used in context. That's why they always group them together.

    You can't assume that the numbers Xbit puts up are "maximum" or even "average." All we can gather from them is that a 7950 GX2 draws more power than an 8800GTX when running the same application. That's all.

    From Xbit:

    We loaded the GPU by launching the first SM3.0/HDR graphics test from 3DMark06 and running it in a loop at 1600x1200 resolution and with enabled 4x full-screen antialiasing and 16x anisotropic filtering. The Peak 2D load was created by means of the 2D Transparent Windows test from Futuremark’s PCMark05 benchmarking suite. We did not test the XFX GeForce 8800 GTX XXX Edition separately. Its shader processors, which are the biggest consumer in the GeForce 8800, are clocked at the same frequency as the reference card’s, while the growth of the frequencies of the main domain and memory is not as big as to influence the card’s power draw much.
    So there's no way that's the most power the cards draw.

    And FYI: if you're using 3DMark06 to benchmark power consumption, the scene with the zeppelin flying around the sea dragon (called "Canyon Flight") uses more power than anything else in 3DMark06.

  10. #20
    Join Date: Oct 2006
    Location: Montreal, Canada
    Posts: 985

    And running CoH uses more power than pretty much anything I've tried...period.

    Even running ATItool and Orthos together takes less power than CoH.
