
 GeForce 9 series thread

ikanayam
post Jan 3 2008, 02:16 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(clayclws @ Jan 3 2008, 12:32 AM)
Jen-Hsun Huang is Taiwanese...and they did release GeForce 4 didn't they? GDDR5 is simply a much cheaper move rather than GDDR4. Ikanayam mentioned that there's some political stuff involved with why NVIDIA ain't shifting towards GDDR4...you'll need to ask him more on that.
*
Maybe i missed something, but i don't see where this "much cheaper" idea is coming from...
clayclws
post Jan 3 2008, 02:28 PM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


QUOTE(ikanayam @ Jan 3 2008, 02:16 PM)
Maybe i missed something, but i don't see where this "much cheaper" idea is coming from...
*
Err...my mistake. I meant a more rational move in terms of money. GDDR4 is not that much of an improvement over GDDR3, but GDDR5 is, since both Qimonda and Samsung managed to hit GDDR5-class specs while still manufacturing GDDR4.

This post has been edited by clayclws: Jan 3 2008, 02:28 PM
ikanayam
post Jan 3 2008, 02:39 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(clayclws @ Jan 3 2008, 01:28 AM)
Err...my mistake. I meant a more rational move in terms of money. GDDR4 is not that much of an improvement over GDDR3, but GDDR5 is, since both Qimonda and Samsung managed to hit GDDR5-class specs while still manufacturing GDDR4.
*
GDDR3->GDDR4 is as big an improvement as GDDR4->GDDR5. However, the fact that NVIDIA is strongly against it likely held back GDDR4 development, while it helped push GDDR3 speeds higher and sped up development of GDDR5. No one wants to invest much in a product without volume, which is why you see some manufacturers skipping GDDR4 completely.

edit: It is also very likely that the NV chips since G80 have GDDR4 support in case they needed it to be competitive, but they never needed it because things worked out as they wanted.

This post has been edited by ikanayam: Jan 3 2008, 02:42 PM
clayclws
post Jan 3 2008, 02:45 PM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


QUOTE(ikanayam @ Jan 3 2008, 02:39 PM)
GDDR3->GDDR4 is as big an improvement as GDDR4->GDDR5. However, the fact that NVIDIA is strongly against it likely held back GDDR4 development, while it helped push GDDR3 speeds higher and sped up development of GDDR5. No one wants to invest much in a product without volume, which is why you see some manufacturers skipping GDDR4 completely.

edit: It is also very likely that the NV chips since G80 have GDDR4 support in case they needed it to be competitive, but they never needed it because things worked out as they wanted.
*
That's why I called it a more rational move in terms of money (rather than "much cheaper"): volume. So why is it that NVIDIA does not support GDDR4? Is AMD going to support GDDR5?
ikanayam
post Jan 3 2008, 03:31 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(clayclws @ Jan 3 2008, 01:45 AM)
That's why I called it a more rational move in terms of money (rather than "much cheaper"): volume. So why is it that NVIDIA does not support GDDR4? Is AMD going to support GDDR5?
*
AMD has nothing against GDDR5. As for NVIDIA's reasons, try Google maybe, but I don't know if that helps...
emy_xvidia
post Jan 3 2008, 06:05 PM

Look at all my stars!!
*******
Senior Member
2,735 posts

Joined: Mar 2006
From: Malaysia - Swindon Town
QUOTE(ikanayam @ Jan 3 2008, 02:16 PM)
Maybe i missed something, but i don't see where this "much cheaper" idea is coming from...
*
GDDR5 should be cheaper because it's made on a smaller production process than GDDR4; it's expected to be up to 4 times faster than GDDR3 and to use about 20% less power..

I guess since Samsung and Qimonda have already developed GDDR5, I can see why NVIDIA wants to jump directly to it..
ianho
post Jan 3 2008, 06:17 PM

Cucimangkoklife
Group Icon
VIP
15,705 posts

Joined: Mar 2005
From: Tg. Rambutan
LOL. So many ppl here keep saying wait for this, wait for that, so u can play Crysis nicely with full eye candy on. Cevat Yerli already said in interviews that Crysis is here to torture high-end rigs for another 3 years. That's about right, based on what Far Cry did to our rigs. Only after 3 years could we play Far Cry with everything turned on at 2560x1600 with nice smooth frames. Soooo, those of u still waiting can continue waiting for 3 more years.
ikanayam
post Jan 3 2008, 06:19 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(emy_xvidia @ Jan 3 2008, 05:05 AM)
GDDR5 should be cheaper because it's made on a smaller production process than GDDR4; it's expected to be up to 4 times faster than GDDR3 and to use about 20% less power..

I guess since Samsung and Qimonda have already developed GDDR5, I can see why NVIDIA wants to jump directly to it..
*
Smaller processes don't necessarily make it cheaper, especially not initially, and especially since the chips will probably ship in larger capacities. Newer, faster memory will always command a price premium, and that will remain true through 2008. So I don't see the cheaper cards moving to this memory type yet, because it's not a cost saving for them.
It will end up about 3-4 times faster at most, and that power-consumption figure is not useful without knowing what speed it was compared against. You won't magically get 4 times the speed right off the bat; that will only come as the technology matures, closer to the end of its life cycle.
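The scaling point above can be made concrete with per-pin data rates. A rough sketch in Python; the clock figures below are illustrative assumptions for the sake of the arithmetic, not manufacturer specs:

```python
# GDDR3 is double data rate: 2 transfers per pin per memory clock.
# GDDR5 moves 4 bits per pin per command clock, so its headline
# transfer rate is 4x the command clock.

def gddr3_rate_mtps(clock_mhz: float) -> float:
    """Per-pin transfer rate of GDDR3 in MT/s."""
    return clock_mhz * 2

def gddr5_rate_mtps(cmd_clock_mhz: float) -> float:
    """Per-pin transfer rate of GDDR5 in MT/s."""
    return cmd_clock_mhz * 4

# Hypothetical figures: early GDDR5 against mature GDDR3 is no 4x jump...
print(gddr5_rate_mtps(900) / gddr3_rate_mtps(1200))   # 1.5x at launch
# ...the big multiplier only appears once GDDR5 clocks mature.
print(gddr5_rate_mtps(1750) / gddr3_rate_mtps(1000))  # 3.5x later on
```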

This post has been edited by ikanayam: Jan 3 2008, 06:22 PM
emy_xvidia
post Jan 3 2008, 06:29 PM

Look at all my stars!!
*******
Senior Member
2,735 posts

Joined: Mar 2006
From: Malaysia - Swindon Town
QUOTE(ikanayam @ Jan 3 2008, 06:19 PM)
Smaller processes don't necessarily make it cheaper, especially not initially, and especially since the chips will probably ship in larger capacities. Newer, faster memory will always command a price premium, and that will remain true through 2008. So I don't see the cheaper cards moving to this memory type yet, because it's not a cost saving for them.
It will end up about 3-4 times faster at most, and that power-consumption figure is not useful without knowing what speed it was compared against. You won't magically get 4 times the speed right off the bat; that will only come as the technology matures, closer to the end of its life cycle.
*
Yeah, agreed with what u say there, but mass production of GDDR5 by Samsung and Qimonda will be one of the reasons why it will end up cheaper than GDDR4..

GDDR4 was first used on the X1950 XTX around August 2006.. it's still used now, but only by a few ATI models.. demand for GDDR4 is not that high, and now that NVIDIA has decided to jump to GDDR5, that will affect the price of GDDR4 as well as of GDDR5 itself..

NVIDIA influences the market so greatly because it holds the larger share of the current graphics market, if I'm not mistaken roughly 7:3, maybe even heading towards 9:1, because of the current success of its 8 series..

This post has been edited by emy_xvidia: Jan 3 2008, 06:35 PM
ikanayam
post Jan 3 2008, 08:43 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(emy_xvidia @ Jan 3 2008, 05:29 AM)
Yeah, agreed with what u say there, but mass production of GDDR5 by Samsung and Qimonda will be one of the reasons why it will end up cheaper than GDDR4..

GDDR4 was first used on the X1950 XTX around August 2006.. it's still used now, but only by a few ATI models.. demand for GDDR4 is not that high, and now that NVIDIA has decided to jump to GDDR5, that will affect the price of GDDR4 as well as of GDDR5 itself..

NVIDIA influences the market so greatly because it holds the larger share of the current graphics market, if I'm not mistaken roughly 7:3, maybe even heading towards 9:1, because of the current success of its 8 series..
*
Seems you're repeating what I said earlier without properly understanding it. It's not going to be cheaper than GDDR4 right off the bat either. GDDR5 will likely command a price premium over other memory types throughout most if not all of 2008. Once again, to make things absolutely clear: price is not the main reason they're jumping to it.
jinaun
post Jan 3 2008, 08:45 PM

where are my stars???
Group Icon
Elite
6,139 posts

Joined: Jan 2003
Well.. I read on VR-Zone that the GeForce 9 series will still be DX10, not DX10.1.

Perhaps a refresh of GeForce 9 will include 10.1..

lol
emy_xvidia
post Jan 3 2008, 08:50 PM

Look at all my stars!!
*******
Senior Member
2,735 posts

Joined: Mar 2006
From: Malaysia - Swindon Town
QUOTE(ikanayam @ Jan 3 2008, 08:43 PM)
Seems you're repeating what I said earlier without properly understanding it. It's not going to be cheaper than GDDR4 right off the bat either. GDDR5 will likely command a price premium over other memory types throughout most if not all of 2008. Once again, to make things absolutely clear: price is not the main reason they're jumping to it.
*
I'm not saying they're jumping directly to GDDR5 because of price alone.. Hynix, Samsung and Qimonda are already at the mass-production stage with GDDR5, which means the technology is already there, so why should they still be bound to GDDR4, right?
ikanayam
post Jan 3 2008, 08:56 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(emy_xvidia @ Jan 3 2008, 07:50 AM)
I'm not saying they're jumping directly to GDDR5 because of price alone.. Hynix, Samsung and Qimonda are already at the mass-production stage with GDDR5, which means the technology is already there, so why should they still be bound to GDDR4, right?
*
It's a cost vs benefit thing. It may be cheaper to go with wider buses and slower memory. You don't just jump to the newest most expensive things just because they're "already there". Nvidia doesn't even jump to the newest process technology as soon as it's there. They prefer to stay slightly behind because it's cheaper and less problematic. Let the early adopters deal with the problems first.
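The cost-vs-benefit tradeoff above is just bandwidth arithmetic: peak bandwidth is bus width (in bytes) times the effective data rate. The two configurations below are hypothetical, chosen only to show that a wide bus with slower memory can match a narrow bus with faster GDDR5:

```python
def bandwidth_gbs(bus_bits: int, data_rate_mtps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    return bus_bits / 8 * data_rate_mtps / 1000

# Hypothetical cards: a 512-bit bus with 2000 MT/s GDDR3 delivers the
# same peak bandwidth as a 256-bit bus with 4000 MT/s GDDR5.
print(bandwidth_gbs(512, 2000))  # 128.0 GB/s
print(bandwidth_gbs(256, 4000))  # 128.0 GB/s
```

The wider bus costs more in PCB routing and die pads, while the faster memory costs more per chip, which is exactly the tradeoff being weighed.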
clayclws
post Jan 3 2008, 09:50 PM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


I think he meant that GDDR4 was only mass-adopted by AMD.ATI, while GDDR5 will be mass-adopted by both AMD.ATI AND NVIDIA (not sure about Intel with their new graphics technology). So essentially, it will get cheaper faster than GDDR4 did. I reckon AMD.ATI will stick with GDDR4 for the time being, while NVIDIA may use GDDR3 first before jumping to GDDR5 for GeForce 9. Just a thought...not a fact.


Added on January 3, 2008, 9:51 pm: And I still can't figure out the reasoning behind NVIDIA not using GDDR4.

This post has been edited by clayclws: Jan 3 2008, 09:51 PM
emy_xvidia
post Jan 3 2008, 11:40 PM

Look at all my stars!!
*******
Senior Member
2,735 posts

Joined: Mar 2006
From: Malaysia - Swindon Town
QUOTE(ikanayam @ Jan 3 2008, 08:56 PM)
It's a cost vs benefit thing. It may be cheaper to go with wider buses and slower memory. You don't just jump to the newest most expensive things just because they're "already there". Nvidia doesn't even jump to the newest process technology as soon as it's there. They prefer to stay slightly behind because it's cheaper and less problematic. Let the early adopters deal with the problems first.
*
Then it might be better to go for wider buses AND faster memory, right? And yes, it's about GDDR5 already being there.. Qimonda, for example, skipped GDDR4 because they saw it as a niche product; it would have been a loss for them to produce GDDR4 while Samsung, their main competitor, had already announced GDDR5 memory.. the life cycle of GDDR4 won't be that long, and Hynix, Samsung and Qimonda have all already gone public about their GDDR5.. yes, it will be expensive initially, but in large quantities it will get cheaper..

And GDDR4's memory speed isn't that impressive; on most ATI products that use GDDR4, the speed doesn't differ much from optimized GDDR3.. NVIDIA plans to abandon it because Qimonda announced GDDR3 able to reach clocks of 1GHz, almost equal to GDDR4 memory, and maybe even more than that, expected to reach as high as 1.2GHz (2.4GHz effective)... demand for GDDR4 is also lower because only ATI uses it, and the market share for ATI cards isn't that impressive either..

QUOTE(clayclws @ Jan 3 2008, 09:50 PM)
I think he meant that GDDR4 was only mass-adopted by AMD.ATI, while GDDR5 will be mass-adopted by both AMD.ATI AND NVIDIA (not sure about Intel with their new graphics technology). So essentially, it will get cheaper faster than GDDR4 did. I reckon AMD.ATI will stick with GDDR4 for the time being, while NVIDIA may use GDDR3 first before jumping to GDDR5 for GeForce 9. Just a thought...not a fact.


Added on January 3, 2008, 9:51 pm: And I still can't figure out the reasoning behind NVIDIA not using GDDR4.
*
There are no clear reasons why they want to skip GDDR4, but we can guess why from the performance benefits and manufacturing costs of GDDR4 and GDDR5 as announced by the major memory manufacturers..
clayclws
post Jan 3 2008, 11:50 PM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


Also, maybe because there is one fewer GDDR4 manufacturer (Qimonda skipped it), the price of GDDR4 was not competitive enough for NVIDIA to consider.
emy_xvidia
post Jan 3 2008, 11:59 PM

Look at all my stars!!
*******
Senior Member
2,735 posts

Joined: Mar 2006
From: Malaysia - Swindon Town
QUOTE(clayclws @ Jan 3 2008, 11:50 PM)
Also, maybe because there is one fewer GDDR4 manufacturer (Qimonda skipped it), the price of GDDR4 was not competitive enough for NVIDIA to consider.
*
Not sure about this, but possible.. I guess NVIDIA wants to push GDDR5-based cards badly enough to force people to abandon the GDDR4-based cards which ATI is currently selling.. lol..

Obviously people will prefer something faster and newer, right?
skylinegtr34rule4life
post Jan 4 2008, 12:00 AM

13k elite :P
********
Senior Member
13,340 posts

Joined: Feb 2005
From: back from vacation XD



Another dirty trick by NVIDIA again.. but why DX10 and not DX10.1?

This post has been edited by skylinegtr34rule4life: Jan 4 2008, 12:00 AM
emy_xvidia
post Jan 4 2008, 12:02 AM

Look at all my stars!!
*******
Senior Member
2,735 posts

Joined: Mar 2006
From: Malaysia - Swindon Town
QUOTE(skylinegtr34rule4life @ Jan 4 2008, 12:00 AM)
Another dirty trick by NVIDIA again.. but why DX10 and not DX10.1?
*
Yeah, DirectX 10-based games are hardly playable with the current generation of cards, and now 10.1 is already out.. what crap..
clayclws
post Jan 4 2008, 12:22 AM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


QUOTE(emy_xvidia @ Jan 3 2008, 11:59 PM)
Not sure about this, but possible.. I guess NVIDIA wants to push GDDR5-based cards badly enough to force people to abandon the GDDR4-based cards which ATI is currently selling.. lol..

Obviously people will prefer something faster and newer, right?
*
GDDR4 is better than GDDR3, but no, NVIDIA didn't use it; and the thousands of people on 8800 GT, GTS and GTX cards don't seem to miss it.

QUOTE(skylinegtr34rule4life @ Jan 4 2008, 12:00 AM)
Another dirty trick by NVIDIA again.. but why DX10 and not DX10.1?
*
How sure are you that NVIDIA is playing tricks?

QUOTE(emy_xvidia @ Jan 4 2008, 12:02 AM)
Yeah, DirectX 10-based games are hardly playable with the current generation of cards, and now 10.1 is already out.. what crap..
*
BioShock is very playable with full details at 1920x1200 on an 8800 GT.
