GeForce 9 series thread
ikanayam
Jan 4 2008, 12:45 AM
QUOTE(emy_xvidia @ Jan 3 2008, 10:40 AM) It might be better, then, to go for wider buses and faster memory, right? And yes, GDDR5 is already there. Qimonda, for example, is skipping GDDR4 because they see it as a niche product; it would be a loss for them to produce GDDR4 while Samsung, their main competitor, has already announced GDDR5 memory. The life cycle of GDDR4 won't be that long, since all three (Hynix, Samsung, Qimonda) have already gone public about their GDDR5. Yes, it will be expensive initially, but in large quantities it will get cheaper. GDDR4's speed isn't that impressive either; on most ATi products that use GDDR4, the speed doesn't differ much from optimized GDDR3. Nvidia plans to abandon it because Qimonda announced GDDR3 able to clock at 1GHz, almost equal to GDDR4, and maybe even higher than that, expected to reach as high as 1.2GHz (2.4GHz effective). The need for GDDR4 is also smaller because it's only used by ATI, and the market share for ATI cards isn't that impressive either. There are no clear reasons why they want to skip GDDR4, but we can guess from the performance and manufacturing-cost figures for GDDR4 and GDDR5 announced by the major memory manufacturers.

No point having more bandwidth than you can use; external bandwidth is expensive. R600 was a good example of too much bandwidth. You always want maximum utilization of external bandwidth because it is expensive. Nvidia is not skipping GDDR4 for technical or cost reasons, as I have said before. The relatively slow development of GDDR4, the long life and extended development of GDDR3, and the quicker move to GDDR5 are a result of NV skipping it rather than the reason they are skipping it.
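For reference, peak memory bandwidth is just bus width times effective data rate, so the trade-off being debated here (faster memory vs. a wider bus) can be sketched in a few lines. This is only a back-of-the-envelope sketch; the clock figures are the illustrative numbers quoted above, not confirmed specs.

```python
# Peak theoretical bandwidth = bus width (in bytes) x effective (DDR) data rate.
# Clock figures are illustrative numbers from the discussion above, not confirmed specs.

def bandwidth_gb_s(bus_width_bits, effective_clock_ghz):
    """Peak memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * effective_clock_ghz

# 256-bit bus with GDDR3 pushed to 1.2GHz (2.4GHz effective), as mentioned above
print(bandwidth_gb_s(256, 2.4))   # 76.8 GB/s

# Same 256-bit bus with the ~2.0GHz effective GDDR4 used on the X1950 XTX
print(bandwidth_gb_s(256, 2.0))   # 64.0 GB/s

# A wider 512-bit bus (R600-style) with slower GDDR3 at ~1.65GHz effective
print(bandwidth_gb_s(512, 1.65))  # 105.6 GB/s
```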
clayclws
Jan 4 2008, 12:49 AM
QUOTE(ikanayam @ Jan 4 2008, 12:45 AM) No point having more bandwidth than you can use; external bandwidth is expensive. R600 was a good example of too much bandwidth. You always want maximum utilization of external bandwidth because it is expensive. Nvidia is not skipping GDDR4 for technical or cost reasons, as I have said before. The relatively slow development of GDDR4, the long life and extended development of GDDR3, and the quicker move to GDDR5 are a result of NV skipping it rather than the reason they are skipping it.

The way you put it, they intentionally want to make GDDR4 look bad...
ikanayam
Jan 4 2008, 12:53 AM
QUOTE(clayclws @ Jan 3 2008, 11:49 AM) The way you put it, they intentionally want to make GDDR4 look bad...

Now I wonder why they would do that... http://www.driverheaven.net/reviews/X1950XTXreview/GDDR4.pdf
clayclws
Jan 4 2008, 12:56 AM
Surely ATI is the promoter... but they did not pioneer it.
ikanayam
Jan 4 2008, 01:02 AM
QUOTE(clayclws @ Jan 3 2008, 11:56 AM) Surely ATI is the promoter... but they did not pioneer it.

Page 2 of the PDF tells me they were a lot more than a promoter.
clayclws
Jan 4 2008, 01:08 AM
QUOTE(ikanayam @ Jan 4 2008, 01:02 AM) Page 2 of the PDF tells me they were a lot more than a promoter.

I've seen it before, but I never felt that it would mean so much. GDDR5 is also being designed under the same JC-42.3 subcommittee, still led by Joe Macri. So, is NVIDIA going to abandon GDDR5 as well?

This post has been edited by clayclws: Jan 4 2008, 01:14 AM
ikanayam
Jan 4 2008, 01:20 AM
QUOTE(clayclws @ Jan 3 2008, 12:08 PM) I've seen it before, but I never felt that it would mean so much. GDDR5 is also being designed under the same JC-42.3 subcommittee, still led by Joe Macri. So, is NVIDIA going to abandon GDDR5 as well?

I think they're not too pleased with GDDR5 either, but I also think they made their point clear enough the last round, and they probably won't sacrifice their lead for politics.
clayclws
Jan 4 2008, 01:23 AM
Ah well... chairman does not mean much if the rest do not agree or work together. Since GDDR4 was kind of a flop... they won't repeat the same mistake. Here's hoping for GDDR5 in D9E. I think D9M and D9P will still use GDDR3.
raymond8341
Jan 4 2008, 01:07 PM
NVIDIA's yet-to-be-released GeForce 9800 GX2 in the "flesh." We reveal some of the specs and what should be expected. The best way to think of the GeForce 9800 GX2 is as an 8800 GPU that has been die-shrunk to 65nm and placed in an SLI configuration on a "single" card. The 9800 GX2 is very reminiscent of the 7950 GX2 of days past. (And we loved the 7950 GX2 at launch, but terrible support and diminishing returns soon painted it as one of NVIDIA's biggest failures since the 5800 series.) The GeForce 9800 GX2 will launch in late February or early March as it now stands and will replace the 8800 Ultra (single-GPU) card in NVIDIA's high-end product line-up. The 9800 GX2 is said to be at least 30% faster than an 8800 Ultra. While it is not clear from the pictures below, we are told it will support "Quad SLI."

[Image: GeForce 9800 GX2 Front]
[Image: GeForce 9800 GX2 Back]

GeForce 9800 GX2:
1. 1GB frame buffer
2. Two PCBs
3. Two 65nm GPUs total
4. 256 stream processors total

All of the information here comes from sources overseas that we consider trustworthy. Obviously there are some specifications that could be a bit clearer, especially on the memory bus. We would expect to see two 768MB frame buffers here, one per GPU, but that is not what is spelled out. We are guessing 512MB per GPU currently. Should we learn of any changes and/or corrections, we will certainly update this page and inform our readers on our daily news page.

Source
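A quick bit of arithmetic, using only the figures claimed in the post above, shows why 512MB per GPU is the more consistent guess:

```python
# Sanity check on the frame-buffer guess, using only the figures claimed in the post.
total_frame_buffer_mb = 1024   # "1GB frame buffer" as listed
gpus = 2                       # two 65nm GPUs across two PCBs

print(total_frame_buffer_mb // gpus)  # 512 -> consistent with the 512MB-per-GPU guess

# Two 768MB buffers (one per GPU, as on the 8800 GTX/Ultra) would total 1536MB,
# which does not match the stated 1GB.
print(2 * 768)                        # 1536
```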
Amal
Jan 4 2008, 01:32 PM
QUOTE(raymond8341 @ Jan 4 2008, 01:07 PM) ...The 9800 GX2 is said to be at least 30% faster than an 8800 Ultra...

It doesn't seem so fast. Not only is it a 9 series, it's also a GX2! Only 30% over an 8800 Ultra?? Not worth waiting for, then.

This post has been edited by Amal: Jan 4 2008, 01:35 PM
Core_Tracer
Jan 4 2008, 03:40 PM
Wow... so cool... hope it won't repeat the 7950GX2 failure...
Ezonizs
Jan 4 2008, 03:46 PM
The picture of the card looks so yong sui (ugly), lol.
ianho
Jan 4 2008, 03:53 PM
QUOTE(Core_Trace(R) @ Jan 4 2008, 03:40 PM) Wow... so cool... hope it won't repeat the 7950GX2 failure...

The 7950GX2 was not a failure. It was a brilliant card. The failure was Quad SLI with two units of the 7950GX2 running in SLI.
mADmAN
Jan 4 2008, 04:00 PM
QUOTE(skylinegtr34rule4life @ Jan 4 2008, 03:02 PM) wah why the card all got covered up like the hijjab style martyr this is like ISLAMIC card i prefer the back nekkid at least

Post reported... your remark on religion is not funny, and I find it discriminatory and offensive. Hope to NOT see you around for a couple of days.
Hornet
Jan 4 2008, 04:18 PM
QUOTE(ianho @ Jan 4 2008, 03:53 PM) The 7950GX2 was not a failure. It was a brilliant card. The failure was Quad SLI with two units of the 7950GX2 running in SLI.

Many were pissed off after nVidia released the GeForce 8 not long after they had spent on the 7950GX2. I'm somewhat disappointed that this is the same old GeForce 8800, and only 30% faster. Something is not right. It's probably just another push for GeForce 8 before finally moving on to a new range of GPUs, which makes this GX2 a pointless card to spend on.

This post has been edited by Hornet: Jan 4 2008, 04:32 PM
t3chn0m4nc3r
Jan 4 2008, 07:39 PM
QUOTE(Hornet @ Jan 4 2008, 05:18 PM) Many were pissed off after nVidia released the GeForce 8 not long after they had spent on the 7950GX2. I'm somewhat disappointed that this is the same old GeForce 8800, and only 30% faster. Something is not right. It's probably just another push for GeForce 8 before finally moving on to a new range of GPUs, which makes this GX2 a pointless card to spend on.

No choice... the market demands more power... so they give us more power and demand more cash...
ianho
Jan 4 2008, 07:39 PM
QUOTE(Hornet @ Jan 4 2008, 04:18 PM) Many were pissed off after nVidia released the GeForce 8 not long after they had spent on the 7950GX2. I'm somewhat disappointed that this is the same old GeForce 8800, and only 30% faster. Something is not right. It's probably just another push for GeForce 8 before finally moving on to a new range of GPUs, which makes this GX2 a pointless card to spend on.

I was a 7950GX2 user too. I enjoyed it fully for about 6 months and then moved on to the 8800GTX as soon as it appeared. No need to be pissed, as the LYN Garage Sale is a wonderful place for chronic upgraders to sell off the stuff they don't use after upgrading.
clayclws
Jan 4 2008, 08:21 PM
QUOTE(ianho @ Jan 4 2008, 07:39 PM) I was a 7950GX2 user too. I enjoyed it fully for about 6 months and then moved on to the 8800GTX as soon as it appeared. No need to be pissed, as the LYN Garage Sale is a wonderful place for chronic upgraders to sell off the stuff they don't use after upgrading.

Oh, yeah... I agree with you. It's the only place to trade off stuff you don't want and get 1st/2nd-hand stuff cheaply. Sometimes, things you can't get on the market are available as well. I don't think they will market the 9800GX2 that fast.
Hyde`fK
Jan 4 2008, 11:00 PM
GeForce 9800 & 9600 specs and availability:
1. GeForce 9800 GX2: 2 x 65nm G92, 256 shader processors, 2 x 256-bit memory interface, 1GB memory, Feb-Mar
2. GeForce 9800 GTX: replaces the 8800 GTX, Feb-Mar
3. GeForce 9800 GT: replaces the 8800 GT, Mar-Apr
4. GeForce 9600 GT: launches on Feb 14th, 64 shader processors, 256-bit memory interface, 512MB memory, 8600 GTS < 9600 GT < 8800 GT, US$169

Source: http://forums.vr-zone.com/showthread.php?t=221448
Hornet
Jan 5 2008, 11:19 AM
QUOTE(Hyde`fK @ Jan 4 2008, 11:00 PM) GeForce 9800 & 9600 specs and availability:
1. GeForce 9800 GX2: 2 x 65nm G92, 256 shader processors, 2 x 256-bit memory interface, 1GB memory, Feb-Mar
2. GeForce 9800 GTX: replaces the 8800 GTX, Feb-Mar
3. GeForce 9800 GT: replaces the 8800 GT, Mar-Apr
4. GeForce 9600 GT: launches on Feb 14th, 64 shader processors, 256-bit memory interface, 512MB memory, 8600 GTS < 9600 GT < 8800 GT, US$169
Source: http://forums.vr-zone.com/showthread.php?t=221448

It's funny how they put it that way, as if the 9800 GX2 will be a monster, lol. Basically, it's still 512MB per core. If data has to be duplicated across both cores, the card will only have an effective 512MB. The effective memory bus is also only 256 bits. Given that it's only 30% faster than an 8800 Ultra (though this card has 2x the SPs), I think it goes to show how inefficient the card is. It's two PCBs slapped together, and they probably aren't clocked at very high frequencies. "9800" is just a new number for the 65nm die shrink; it's still the same 15-month-old GPU. I think nVidia are holding back the real next-gen chips until ATi release their R700. Till then, it's a boring 1H 2008.
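A minimal sketch of this point, assuming the rumoured GX2 figures from earlier in the thread (512MB and a 256-bit bus per GPU) and the usual SLI alternate-frame rendering, where each GPU keeps its own copy of the working data:

```python
# With alternate-frame-rendering SLI, each GPU renders from its own copy of textures
# and geometry, so the pool a game can use is the per-GPU amount, not the sum on the box.
# Figures below are the rumoured GX2 specs from this thread, not confirmed numbers.

per_gpu_memory_mb = 512
per_gpu_bus_bits = 256
num_gpus = 2

advertised_memory_mb = per_gpu_memory_mb * num_gpus  # 1024 -> what the box will say
effective_memory_mb = per_gpu_memory_mb              # 512  -> what each frame can actually use
effective_bus_bits = per_gpu_bus_bits                # 256  -> bus width feeding each frame

print(advertised_memory_mb, effective_memory_mb, effective_bus_bits)
```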