 300W for future graphics cards, beyond 300W PCIe 1.1

TSjinaun
post May 23 2007, 12:54 PM

where are my stars???
Elite
6,139 posts

Joined: Jan 2003
Power consumption of up-to-date graphics cards has risen from around 60W to about 160W in the last three years, and it is highly likely that this trend will continue. Even though PCI Express 2.0 is set to overcome the current power consumption limits, the PCI Special Interest Group (PCI-SIG) is working on a standard for add-in cards that fit into current platforms but consume up to 300W.

PCI-SIG recently announced that it is working on an electromechanical specification for PCI Express add-in cards that addresses graphics power and thermals greater than those supported by PCI Express CEM 1.1 and PCI Express 150W 1.0. Its purpose is to provide additional capabilities for PCI Express graphics within the existing framework of an evolutionary strategy based on existing motherboard form factors.

Currently both the ATI Radeon HD 2900 XT and the Nvidia GeForce 8800 GTX/Ultra fit into the present PCI Express 1.1/1.0a specification. According to the PCI Express 1.0a standard, a device that consumes a maximum of 150W should draw at most 75W from the mainboard's slot and at most 75W from a power supply unit's PCI Express connector. The GeForce 8800 GTX/Ultra consumes about 130W, and while it draws more than 80W from the power supply unit (PSU), it uses two connectors, so there should be no problems. The Radeon HD 2900 XT consumes approximately 160W and uses one 6-pin and one 8-pin connector, drawing up to approximately 110W from the PSU, according to measurements by X-bit labs. However, exact specifications for graphics cards that consume more than 150W in total and more than 75W per PCI Express connector are not defined.
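The arithmetic behind those limits can be sketched in a few lines (a minimal illustration; the 75W slot, 75W 6-pin, and 150W 8-pin budgets are the commonly cited figures, and the function name is ours):

```python
# Power budget available to a PCI Express graphics card, using the
# commonly cited limits: 75W from the x16 slot, 75W per 6-pin auxiliary
# connector, 150W per 8-pin auxiliary connector.
SLOT_W = 75
PIN6_W = 75
PIN8_W = 150

def max_board_power(six_pin=0, eight_pin=0):
    """Total power (in watts) available from the slot plus aux connectors."""
    return SLOT_W + six_pin * PIN6_W + eight_pin * PIN8_W

print(max_board_power())                        # slot only: 75 W
print(max_board_power(six_pin=2))               # two 6-pin: 225 W
print(max_board_power(six_pin=1, eight_pin=1))  # 6-pin + 8-pin: 300 W
```

Under these assumptions, two 6-pin connectors comfortably cover the 8800 GTX's ~130W, while a 6-pin plus an 8-pin reaches exactly the 300W ceiling the new specification targets.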

The new PCI Express 225W/300W graphics card electromechanical specification is primarily designed to deliver additional electrical power to a PCI Express graphics add-in card and provide increased card volume for the management of thermals, according to a statement by PCI SIG.

The 30-day member review for the version 0.5 draft specification ends on Wednesday, June 6, 2007.


http://www.xbitlabs.com/news/video/display...0522115634.html
watcher
post May 24 2007, 09:53 AM

:hehe: One of the good guys...
*****
Senior Member
922 posts

Joined: Dec 2005
Damn... why do graphics cards nowadays consume more & more power as their performance increases???

... unlike CPUs, whose performance keeps increasing while power consumption even drops???

Don't NVIDIA & ATI understand going green, now that the world faces power & pollution crises??? Don't they give a damn about the environment???!!!

Don't these guys strive for better 'performance per Watt' rather than pure performance?

<sigh>
SUSjoe_star
post May 24 2007, 09:55 AM

Serving the Servants
******
Senior Member
1,810 posts

Joined: Mar 2007
Nonsense wei, I don't wanna get a 1000W PSU in a year's time!!!
DarkForce
post May 24 2007, 10:31 AM

short thinking syndrom
****
Senior Member
544 posts

Joined: Jul 2006
From: DarkForce@LYN.net



lol... scared already... will need a 1kW PSU...

LExus65
post May 24 2007, 10:43 AM

Old Gezzer.....
******
Senior Member
1,995 posts

Joined: May 2005


wah, by then how many pin sockets will the motherboard need..............

why can't they make more efficient GPUs ar?? CPUs and RAM are getting less and less power hungry, but GPUs more and more
hiroshi
post May 24 2007, 11:27 AM

Regular
******
Senior Member
1,424 posts

Joined: Jan 2003
Will future graphics cards look like this? Is a water cooling system required?

Aoshi_88
post May 24 2007, 11:43 AM

Talking isn't difficult. Speaking is.
*******
Senior Member
4,670 posts

Joined: Dec 2004


As the manufacturing process becomes smaller and smaller... more energy is lost as heat, because the copper wires become so thin that they cannot conduct electricity efficiently.

So, to overcome the scale at which the card is manufactured, more power is needed, which causes the GPU to require more watts. This is why you get a lot of energy loss as the scale shrinks.


EDIT: this is my explanation for the increased consumption. It's a very basic explanation and it does have some holes in it, so take it with a pinch of salt.

This post has been edited by Aoshi_88: May 24 2007, 11:44 AM
ikanayam
post May 24 2007, 11:56 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(Aoshi_88 @ May 23 2007, 10:43 PM)
As the manufacturing process becomes smaller and smaller... more energy is lost in terms of heat because the copper wires cannot conduct enough electricity due to them being so thin.

So to overcome the scale at which the card is manufactured at, more power is needed thus causing the GPU to require more watts. This is why you have a lot of energy loss as the scale becomes smaller and smaller.
EDIT: this is my explanation for the increased consumption. Very basic explanation, it does have some holes in it so take with a pinch of salt.
*
Only the low-level local interconnect scales with the process. The higher-level interconnect generally does not shrink with a process shrink.

There are many reasons; one of them is that power density is still increasing with process shrinks. You get ~30% less power per transistor, but you can fit ~2x the number of transistors in the same area. And these chips keep getting bigger too.
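The power-density point above can be illustrated with rough numbers (a sketch using the post's own ~30% and ~2x figures, which are approximations rather than exact process data):

```python
# Rough power-density trend across process shrinks, using the figures
# from the post: each shrink cuts power per transistor by ~30% but
# doubles the transistor count in the same die area.
POWER_SCALE = 0.7     # ~30% less power per transistor per shrink
DENSITY_SCALE = 2.0   # ~2x transistors in the same area per shrink

def power_density_after(shrinks):
    """Relative power per unit area after a number of process shrinks."""
    return (POWER_SCALE * DENSITY_SCALE) ** shrinks

for n in range(4):
    print(f"after {n} shrink(s): {power_density_after(n):.2f}x power density")
```

Each shrink multiplies power per unit area by roughly 0.7 × 2 = 1.4x, so even though each transistor gets cheaper to run, the chip as a whole gets hotter per square millimetre, before even accounting for growing die sizes.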
stone13
post May 24 2007, 11:59 AM

Getting Started
**
Junior Member
272 posts

Joined: Jun 2006


kind of scary when you think about the graphics card eating all your PSU's power

This post has been edited by stone13: May 24 2007, 12:00 PM
PGV3910
post May 24 2007, 01:01 PM

Regular
******
Senior Member
1,885 posts

Joined: Jan 2007
logic.. high performance = high power consumption..
same as a vehicle.. high cc = high fuel usage.
that's my thinking ler..
Ryo
post May 24 2007, 01:07 PM

.:: Rise Lord Vader ::.
*****
Senior Member
871 posts

Joined: Jun 2005
From: A Galaxy Far Far Away...


i think later on a 1kW PSU will be a must if power consumption keeps increasing like this
PGV3910
post May 24 2007, 01:23 PM

Regular
******
Senior Member
1,885 posts

Joined: Jan 2007
QUOTE(Ryo @ May 24 2007, 01:07 PM)
i think later on a 1kW PSU will be a must if power consumption keeps increasing like this
*
hmmm.. agree with you bro.. we're dead laaa..
SUSdattebayo
post May 24 2007, 02:51 PM

Look at all my stars!!
*******
Senior Member
5,366 posts

Joined: Aug 2005


which means in the future laptops won't be capable of high-end 3D games anymore
t3chn0m4nc3r
post May 24 2007, 09:04 PM

Teh Necron Lord
*******
Senior Member
4,139 posts

Joined: Sep 2006
From: Internet


QUOTE(dattebayo @ May 24 2007, 03:51 PM)
which means in the future laptops won't be capable of high-end 3D games anymore
*
which laptop capable of high-end 3D games is available in Malaysia anyway...
a1098113
post May 24 2007, 10:26 PM

~Retired~
*******
Senior Member
3,119 posts

Joined: May 2007
From: Home


i think, hmm, just get what you need and don't fall for consumerism. It doesn't benefit us; it benefits the marketers. When your needs only require a 256MB DDR2 GPU, don't fall for consumerism by getting an 8800 GTS or whatever. Well, if you think about it, the less we buy highly priced items, the more time works against the suppliers, and supply and demand will definitely bring card prices down. But honestly, who needs such power consumption, when people can barely earn enough to pay their monthly electricity bills?

My two cents worth.
SUSdattebayo
post May 24 2007, 11:15 PM

Look at all my stars!!
*******
Senior Member
5,366 posts

Joined: Aug 2005



greatgreedyguts
post May 25 2007, 12:56 PM

When in doubt, toss your wife
****
Senior Member
666 posts

Joined: Sep 2005
From: melaka



Dell already has an 8600 Go laptop,
but the heat generated is unimaginable.
skydna
post May 25 2007, 02:24 PM

Getting Started
**
Junior Member
236 posts

Joined: Jan 2003
i hope graphics cards can be made as external devices...........
ronho
post May 31 2007, 09:43 PM

Regular
******
Senior Member
1,356 posts

Joined: Dec 2006
From: Subang


can't imagine why the manufacturers don't come up with a green card.. if we all make noise and don't buy, then they will have no choice, right?
badguy86
post May 31 2007, 10:50 PM

Getting Started
**
Junior Member
292 posts

Joined: Sep 2006
From: Kuching, Sarawak, Malaysia



They should push graphics technology towards being environmentally friendly!!!
