
Thread: HarHarHar! - GF8800 to use 400+ Watts!

  1. #1
    Join Date
    Oct 2006
    Location
    Your Mom's bed
    Posts
    295
    Thanks Given
    0
    Thanks Received
    0
    Thanked in
    0 Posts

    Default HarHarHar! - GF8800 to use 400+ Watts!

    Make sure to read the first comment - Funny as hell!

    http://www.dailytech.com/article.aspx?newsid=4442

  2. #2
    Join Date
    Oct 2006
    Posts
    1,333
    Thanks Given
    0
    Thanks Received
    0
    Thanked in
    0 Posts

    Default

    The first post does make many pertinent points. I may have to start working on my fast-breeder reactor to power my rig. With shrinking process technologies, one would expect power consumption to trend downward, or at the very least stay put. Now this is really getting out of hand.

  3. #3
    Join Date
    Oct 2006
    Location
    jonnyGURU forums, of course!
    Posts
    16,153
    Thanks Given
    543
    Thanks Received
    287
    Thanked in
    207 Posts

    Default

    This sucks.

    Die sizes and power consumption are supposed to go DOWN as a product evolves, not up. Otherwise, you're just "overclocking" an existing product with better yields.

    What really pisses me off is that Intel "quietly" caved in on the 240VA rail limit for the ATX12V standard. Rail separation has good intentions, but with power-hungry GPUs, the 240VA limit simply doesn't work (unless there's a separate rail for each PCI-e connector, like the OCZ GameXstream and Ultra X-Finity 800W).

    I'm not saying I "fault" Intel, but if they had stood their ground on the ATX12V standard of no rail over 240VA, then GPU manufacturers would have had to improve power efficiency, because no ATX12V power supply would power their high-end cards!
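For concreteness, here's a quick back-of-envelope sketch of what that 240VA limit means on the 12V rail (the 400W card figure is an assumed example for illustration, not a measured draw):

```python
# Sketch of the ATX12V 240 VA per-rail safety limit in numbers.
# The GPU wattage below is an illustrative assumption, not a datasheet value.

VA_LIMIT = 240.0      # ATX12V safety limit per output rail (volt-amps)
RAIL_VOLTAGE = 12.0   # the +12 V rail

max_amps = VA_LIMIT / RAIL_VOLTAGE
print(f"Max current per 12 V rail under the limit: {max_amps:.0f} A")  # 20 A

# A hypothetical 400 W card fed from a single 12 V rail:
gpu_watts = 400.0
gpu_amps = gpu_watts / RAIL_VOLTAGE
print(f"A {gpu_watts:.0f} W GPU draws {gpu_amps:.1f} A -- "
      f"{'within' if gpu_amps <= max_amps else 'over'} the 240 VA limit")
```

That's why a single-rail limit of 20 A can't feed a card in this rumored power class, and why multi-rail designs put each PCI-e connector on its own rail.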

  4. #4
    Join Date
    Oct 2006
    Posts
    1,333
    Thanks Given
    0
    Thanks Received
    0
    Thanked in
    0 Posts

    Default

    I see the GPU market heading the way of the Prescott, where clock frequency was bumped up by adding more and more pipelines, which ofcourse necessitated humongous L2 and even L3 caches to prevent pipeline stalls. All this extra silicon was consuming a lot of power, which was exarcerbated by leakage problems and this equated to massive heat issues.

    While Intel moved onto smaller process technology, power budgets came crashing down only upon implimenting an efficient architecture. The Netburst crap really didn't cut it anymore.

    However, with GPU's I don't think these guys have the freedom to restructure architecture as CPU's, because they have to adhere to DX and OpenGL models. Plus, the GDDR in use is not the most energy efficient device on board. ATI did overhaul their architecture, but sadly I don't see the horizon in terms of hogging power. With GDDR4 and a smaller process (x1950 series), power consumption seems to have levelled out.

    I am really scared for ATI now, they have had the hottest and most inefficient cards on the market so far. DX 10 could make everybody sit up and say FU, this is enough!
    Software readings are crap.

  5. #5
    Join Date
    Oct 2006
    Moderator
    Posts
    4,126
    Thanks Given
    19
    Thanks Received
    14
    Thanked in
    7 Posts

    Default

    As I have become famous....or infamous for saying elsewhere....I'll believe those power figures when I see them (do remember the Chicken Littling over 6800 power consumption, which I have running on a 270w Topower). A recommendation for a 450w PSU for the whole system, when the system recommendation for a 7950 is 400w, hardly makes for the 350w GPUs people were predicting.
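A rough sketch of that arithmetic (all figures are the rumored/recommended numbers quoted above, not measurements):

```python
# Sanity check on the rumored figures (recommendations, not measurements):
# if a whole system with the new card gets a 450 W PSU recommendation, and a
# 7950-based system gets 400 W, the implied extra draw is small.

new_card_system_psu = 450   # W, rumored PSU recommendation with the new GPU
gf7950_system_psu = 400     # W, PSU recommendation for a 7950-based system

implied_extra = new_card_system_psu - gf7950_system_psu
print(f"Implied extra draw over a 7950 system: ~{implied_extra} W")

# Nowhere near the predicted 350 W for the GPU alone:
predicted_gpu_only = 350
print(f"Predicted GPU-only figure ({predicted_gpu_only} W) exceeds that "
      f"delta by {predicted_gpu_only - implied_extra} W")
```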

  6. #6
    Join Date
    Oct 2006
    Moderator
    Posts
    231
    Thanks Given
    0
    Thanks Received
    0
    Thanked in
    0 Posts

    Default

    Nothing new there, but nVidia is sure making this DX10 revolution hard for us PSU "gurus".
    -dB

  7. #7
    Join Date
    Oct 2006
    Location
    In yer 'fridge, eatin' yer pizza
    Posts
    1,785
    Thanks Given
    0
    Thanks Received
    0
    Thanked in
    0 Posts

    Default

    Quote Originally Posted by Super Nade View Post
    I see the GPU market heading the way of the Prescott, where clock frequency was bumped up by adding more and more pipelines, which of course necessitated humongous L2 and even L3 caches to prevent pipeline stalls. All this extra silicon was consuming a lot of power, which was exacerbated by leakage problems, and this equated to massive heat issues.

    Even as Intel moved to smaller process technology, power budgets only came crashing down once an efficient architecture was implemented. The NetBurst crap really didn't cut it anymore.

    However, with GPUs I don't think these guys have the freedom to restructure the architecture the way CPU makers do, because they have to adhere to the DX and OpenGL models. Plus, the GDDR in use is not the most energy-efficient device on board. ATI did overhaul their architecture, but sadly I see no end in sight to the power hogging. With GDDR4 and a smaller process (X1950 series), power consumption seems to have levelled out.

    I am really scared for ATI now; they have had the hottest and most inefficient cards on the market so far. DX10 could make everybody sit up and say FU, this is enough!
    There's a glaring misinterpretation here. Intel did not add more pipelines; they added more stages to the existing pipes. This means each calculation had more stages to go through in a longer pipe. nVidia is adding parallel pipelines to execute more calculations simultaneously, not to speed up the clock speed. In essence it's similar to adding another core, where the pipes (cores) run parallel to one another.
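A toy model of that distinction (all clock speeds and pipe counts below are made-up illustrative numbers, not real chip specs):

```python
# Toy throughput model contrasting the two strategies described above:
# a deeper pipeline raises the clock but not the work done per clock;
# parallel pipelines raise the work per clock at a modest clock speed.

def throughput(clock_ghz, ops_per_clock):
    """Peak operations per second = clock rate x parallel work per cycle."""
    return clock_ghz * 1e9 * ops_per_clock

# Prescott-style: long pipeline -> high clock, still 1 op/clock per pipe
deep_pipeline = throughput(clock_ghz=3.8, ops_per_clock=1)

# GPU-style: modest clock, many parallel pipelines each doing 1 op/clock
parallel_pipes = throughput(clock_ghz=0.6, ops_per_clock=24)

print(f"Deep pipeline : {deep_pipeline / 1e9:.1f} Gops/s")
print(f"Parallel pipes: {parallel_pipes / 1e9:.1f} Gops/s")
```

The parallel design wins on peak throughput without the clock (and leakage) escalation that sank NetBurst, which is the trade-off the post above is pointing at.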

  8. #8
    Join Date
    Oct 2006
    Posts
    68
    Thanks Given
    0
    Thanks Received
    0
    Thanked in
    0 Posts

    Default If it gets that bad

    They'll just have to put the power connector right on the back of the card like monitors have! 120-240VAC direct input.

    The brick idea (à la the Voodoo 5 6000) may make a comeback.

    Imagine UPS manufacturers needing to add GPU power bricks to the list of things NOT to plug into your UPS, right alongside laser printers!

  9. #9
    Join Date
    Oct 2006
    Posts
    41
    Thanks Given
    0
    Thanks Received
    0
    Thanked in
    0 Posts

    Default

    The brick idea is alive and kicking!


    Anyone with two GF8800GTXs in SLI with a quad-core Kentsfield is going to need a LOT of power.

    An overclocked Kentsfield system with 1.65 vcore, 2GB of RAM, and a LOW-power X300 video card pulls ~300 watts at idle.

  10. #10
    Join Date
    Oct 2006
    Posts
    244
    Thanks Given
    0
    Thanks Received
    0
    Thanked in
    0 Posts

    Default

    Kentsfield uses about the same power as the 965XE, the 3.73GHz Pentium Extreme Edition w/HT, which has been the closest thing to quad core so far, with 2 actual cores and 2 'virtual' cores.

    In all honesty, who cares how much power your desktop uses? If you can afford a Kentsfield and two 8800s in SLI, then you can clearly afford the small jump in your electric bill. Personally, I'm thinking of investing in a really powerful PSU right now, with these cards coming out. Oh yeah, I just signed the lease on my new apartment (getting out of my parents' home!) and electricity is included in the rent.
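For the curious, a ballpark of that "small jump" in the bill (every figure here is an assumption for illustration, not a measurement):

```python
# Ballpark of the extra electricity cost of a power-hungry rig.
# All inputs are assumptions: extra draw under load, gaming hours per day,
# and a sample electricity rate.

extra_watts = 400        # assumed extra draw of the SLI setup vs a modest rig
hours_per_day = 4        # assumed daily gaming time
rate_per_kwh = 0.10      # assumed electricity rate in $/kWh

kwh_per_month = extra_watts / 1000 * hours_per_day * 30
monthly_cost = kwh_per_month * rate_per_kwh
print(f"~{kwh_per_month:.0f} kWh/month extra, about ${monthly_cost:.2f}/month")
```

Under those assumptions it really is only a few dollars a month, which supports the point above (and matters even less when electricity is included in the rent).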

