
Thread: X2900XT Review

  1. #31
    Join Date
    Apr 2007
    Posts
    105

    Quote Originally Posted by nicolasb View Post
    my new build really must combine high-end gaming performance with hardware decoding of high-definition MPEG4; but, even now that R600 is actually out, there is still no suitable product on the market!
    So you think that Nvidia's cards (such as the 8800GTS or GTX) can't provide these two things together?

  2. #32
    Join Date
    Feb 2007
    Location
    Emily's Shop
    Posts
    95

    Quote Originally Posted by alexk View Post
    So you think that Nvidia's cards (such as the 8800GTS or GTX) can't provide these two things together?
    Well, no, they can't. Otherwise I wouldn't have a problem, would I?

    The newer mid-range G8x chips do offer hardware MPEG4-decode, but the original G80 ones (8800GTS and 8800GTX) don't. The 8600 and below don't offer the level of gaming performance I'm after.

  4. #34
    Join Date
    Apr 2007
    Posts
    105

    Quote Originally Posted by nicolasb View Post
    Well, no, they can't. Otherwise I wouldn't have a problem, would I?

    The newer mid-range G8x chips do offer hardware MPEG4-decode, but the original G80 ones (8800GTS and 8800GTX) don't.
    WTF are you talking about? ALL of Nvidia's cards, starting from certain GF68xx models, have MPEG-4 hardware acceleration (for all codec implementations, such as H.264, VC-1 and MPEG-2, regardless of the bitstream or profile):
    http://www.nvidia.com/page/purevideo_support.html

    Same goes for ALL of the 8800 series of cards:
    http://www.nvidia.com/page/8800_features.html

    Even my 7900GTO does the hardware acceleration/offloading when I play 1080p H.264 movies (from Apple's website) using WMP11 (which supports DXVA and, therefore, Nvidia's PureVideo). I get less than 45% CPU utilization with my AMD 3800+ CPU.

    The only significant thing the newer cards (8600 and 8500) add is better offloading in terms of CPU utilization (up to 100% offload), nothing more.
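    The partial-versus-full offload argument above can be sketched with a toy model (the numbers here are my own illustration, not from the post or Nvidia's documentation): the CPU is left with whatever fraction of the decode work the GPU does not take.

    ```python
    def cpu_load(software_load_pct: float, offload_fraction: float) -> float:
        """Crude model: CPU load is the share of decode work the GPU leaves behind."""
        return software_load_pct * (1.0 - offload_fraction)

    # Illustrative: a clip that would cost 90% CPU in pure software decode.
    print(cpu_load(90.0, 0.5))  # partial offload (older PureVideo-style cards): 45.0
    print(cpu_load(90.0, 1.0))  # full bitstream offload ("up to 100%"): 0.0
    ```

    Under this toy model, the older cards' partial offload and the G84/G86 cards' full offload differ only in how much decode work lands back on the CPU.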

  5. #35
    Join Date
    Feb 2007
    Location
    Emily's Shop
    Posts
    95

    Quote Originally Posted by alexk View Post
    Even my 7900GTO does the hardware acceleration/offloading when I play 1080p H.264 movies (from Apple's website) using WMP11 (which supports DXVA and, therefore, Nvidia's PureVideo). I get less than 45% CPU utilization with my AMD 3800+ CPU.
    Yes, but I'm not interested only in downloaded material; I'm also interested in HD-DVD and BluRay discs. If you check some figures from AnandTech, you'll see that CPU utilisation during BluRay playback can hit 99% using a Core 2 E6300 and an 8800GTX card. That's much too high for me to be comfortable with.

    http://www.anandtech.com/video/showdoc.aspx?i=2886&p=4 (third graph)

    You are correct that the 8800 cards offer some MPEG4 decoding facilities, but they don't offer enough. If they did as much as G84/G86 do I would be happy. But they don't.

  6. #36
    Join Date
    Apr 2007
    Posts
    105

    You shouldn't trust a single outdated benchmark so blindly. First of all, their measured "maximum CPU utilization" could be a single momentary spike that caused NO frame drops and NO playback slowdown (even if it hit 100%). Second, that test is outdated - here's a newer benchmark, using the latest Nvidia drivers and the latest version of PowerDVD:
    http://www.anandtech.com/video/showdoc.aspx?i=2977&p=5
    http://www.anandtech.com/video/showdoc.aspx?i=2977&p=4
    Notice the (meaningless) "maximum" graphs - they don't even reach 90% for an 8800GTX card. And that is with H.264 (the most demanding of all the MPEG-4 codecs).
    Anyway, if you have a motherboard with dual 8x/16x PCI-E slots, you can always buy a cheap 8500-series card (they are about $90 right now) and use it as a secondary card just for MPEG-4 movie playback... or put those $$$ towards a faster CPU. You really don't have to depend on a single company (ATi) for all of your needs.
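    The "momentary spike" objection above is easy to demonstrate with made-up numbers (these samples are purely illustrative and NOT taken from the AnandTech graphs linked above): one brief spike dominates the "maximum" figure while the sustained load stays far lower.

    ```python
    from statistics import mean

    # Hypothetical per-second CPU samples during playback (illustrative only).
    samples = [62, 58, 61, 99, 60, 59, 63, 57, 60, 61]

    # A single momentary 99% reading sets the "maximum", while the mean load -
    # which is what sustained, smooth playback actually depends on - stays ~60%.
    print(f"mean: {mean(samples):.1f}%  max: {max(samples)}%")
    ```

    This is why a "maximum CPU utilization" bar on its own says nothing about whether any frames were actually dropped.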

  7. #37
    Join Date
    Oct 2006
    Location
    In yer 'fridge, eatin' yer pizza
    Posts
    1,785

    nicolasb, are you overclocking that 6300 at all? If you look closely, core speed does play a part in the maximum CPU utilization on that chart. While it may look grim for a stock-clocked E6300, the outcome may well be a lot better for someone running the chip balls to the wall.

  8. #38
    Join Date
    Feb 2007
    Location
    Emily's Shop
    Posts
    95

    Quote Originally Posted by madmat View Post
    nicolasb, are you overclocking that 6300 at all?
    It's a hypothetical 6300, which could well turn into a hypothetical 6600 when no one is looking. I'll probably be overclocking when gaming, but not necessarily all the time. Noise is more of an issue for video playback, and the system is going to be noisy enough with a high-end video card in there without running the CPU cooler flat-out as well. If anything, it would be nice to have the freedom to underclock when desired.
    Quote Originally Posted by alexk
    http://www.anandtech.com/video/showdoc.aspx?i=2977&p=4
    Notice the (meaningless) "maximum" graphs - they don't even reach 90% for an 8800GTX card.
    Personally I think the fact that it does reach 89% is rather more telling. That's just too close to the point where it's dropping frames for me to be comfortable with. Even if it only drops them occasionally, once is enough. Compare that with the figure for the 8600GTS on the same graph - just 33% CPU. Big difference.

    It isn't actually just CPU offload that I'm interested in anyway, although that's the most important thing. The HDMI implementation on R600 is much better than it is on G80. Proper audio via HDMI, an implementation of HDCP that isn't broken in the way that it is in G80 (ask anyone who has tried to play back HDCP-protected video on a 30" Dell monitor!) and so on. Individually these are not make-or-break features, but they add up.
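    The headroom argument in the post above (89% is "too close" to dropping frames, 33% is not) can be put as a crude back-of-envelope check: if a transient background load would push total CPU usage past 100%, the decoder can miss a frame deadline. The 15% spike figure below is my own assumption, not from the thread.

    ```python
    def at_risk(avg_cpu_pct: float, spike_pct: float) -> bool:
        """Crude headroom check: a transient extra load that pushes the CPU
        past 100% means the decoder could miss a frame deadline."""
        return avg_cpu_pct + spike_pct > 100.0

    # Peak figures discussed above; the 15% background spike is hypothetical.
    for card, load in [("8800GTX", 89.0), ("8600GTS", 33.0)]:
        print(card, "at risk:", at_risk(load, 15.0))  # True for the 8800GTX only
    ```

    Under this model the 8800GTX has only 11% of headroom to absorb any other work, while the 8600GTS has 67%, which is the "big difference" being argued.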

  9. #39
    Join Date
    Apr 2007
    Posts
    105

    Quote Originally Posted by nicolasb View Post
    Compare that with the figure for the 8600GTS on the same graph - just 33% CPU. Big difference.
    Yea, the CPU usage difference is big, but I still don't see how it translates into things like dropped frames (or slowed-down playback). Nobody has shown that, for example, even at 90% CPU utilization you will get even one dropped frame.

    Quote Originally Posted by nicolasb View Post
    It isn't actually just CPU offload that I'm interested in anyway, although that's the most important thing. The HDMI implementation on R600 is much better than it is on G80. Proper audio via HDMI, an implementation of HDCP that isn't broken in the way that it is in G80 (ask anyone who has tried to play back HDCP-protected video on a 30" Dell monitor!) and so on. Individually these are not make-or-break features, but they add up.
    Well, if I were playing HD movies on my PC, I would just use a certain piece of software (Wikipedia mentions it) which would let me play back ANY Blu-Ray or HD-DVD on my PC without HDCP getting in the way (regardless of the display and video output/input type). I see nothing wrong with doing that, since I want to watch my (legally bought) movies the way I want, and all this "copy protection" BS only benefits the companies who invent these things, nobody else (not the movie studios, not the actors/artists). As for sound over HDMI - I guess you want to hear something other than plain old DD/DTS? If so, and your receiver already supports things like DTS-HD and Dolby TrueHD decoding, then yeah, I can see the usefulness of a proper HDMI audio output.
    Anyway, I personally would (and probably will) rather buy a stand-alone HD-DVD and/or Blu-Ray player (or a combo player). In my opinion it is much more convenient to use: you get a proper remote control, you're basically guaranteed proper performance (no dropped frames, ever), you get less noise (these things do have fans, but they are much, MUCH quieter than any average PC, especially one with ATi's new abomination, the 2900XT), they eat significantly less power, and they provide plenty of video/audio outputs for just about any type of display (even ones with only a component input - I don't think there's any "copy protection" implemented for component yet, and there probably never will be). These things aren't cheap, but neither are the equivalent PC drives. Of course, I am speaking from the position of a person with a proper entertainment system (with no silly things such as PCs being used for it, and with plenty of room to put all the stand-alone players near it).
