


 GeForce 9 series thread

ikanayam
post Jan 4 2008, 12:45 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(emy_xvidia @ Jan 3 2008, 10:40 AM)
It might be better then to go for wider buses and faster memory, right? And yes, GDDR5 is already on the way. Qimonda, for example, skipped GDDR4 because they saw it as a niche product: producing GDDR4 would be a loss while Samsung, their main competitor, had already announced GDDR5. GDDR4's life cycle won't be long; the major manufacturers (Hynix, Samsung, Qimonda) have already gone public with their GDDR5. Yes, it will be expensive initially, but in large quantities it will get cheaper.

GDDR4's speed isn't that impressive either, as you can see on most ATi products that use it; it doesn't differ much from optimized GDDR3. Nvidia plans to abandon it because Qimonda announced GDDR3 able to reach 1GHz clocks, almost equal to GDDR4 memory, and expected to go as high as 1.2GHz (2.4GHz effective). The need for GDDR4 is also lower because it's mainly used by ATI, and the market share for ATI cards isn't that impressive either.
There are no clear official reasons why they want to skip GDDR4, but we can guess from the performance benefits and manufacturing costs of GDDR4 and GDDR5 as announced by the major memory manufacturers.
*
No point having more bandwidth than you can use: external bandwidth is expensive, so you always want maximum utilization of it. R600 was a good example of too much bandwidth.

Nvidia is not skipping GDDR4 for technical or cost reasons, as I have said before. The relatively slow development of GDDR4, the long life and extended development of GDDR3, and the quicker move to GDDR5 are a result of NV skipping it, not the reason they are skipping it.
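The arithmetic behind this is simple. A minimal sketch of peak theoretical bandwidth, using the clock figures quoted above (the exact clocks are illustrative, not confirmed specs):

```python
# Peak theoretical memory bandwidth: effective clock * bus width / 8.
def bandwidth_gbs(effective_clock_ghz, bus_width_bits):
    """GB/s moved if every cycle on every data pin were used."""
    return effective_clock_ghz * bus_width_bits / 8

# GDDR3 at 1.2 GHz (2.4 GHz effective) on a 256-bit bus:
print(bandwidth_gbs(2.4, 256))   # 76.8 GB/s
# GDDR4 at an assumed 2.2 GHz effective rate on the same bus:
print(bandwidth_gbs(2.2, 256))   # 70.4 GB/s
```

Real throughput is always lower than this peak; the gap between peak and achieved is exactly the utilization problem described above.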
clayclws
post Jan 4 2008, 12:49 AM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


QUOTE(ikanayam @ Jan 4 2008, 12:45 AM)
No point having more bandwidth than you can use: external bandwidth is expensive, so you always want maximum utilization of it. R600 was a good example of too much bandwidth.

Nvidia is not skipping GDDR4 for technical or cost reasons, as I have said before. The relatively slow development of GDDR4, the long life and extended development of GDDR3, and the quicker move to GDDR5 are a result of NV skipping it, not the reason they are skipping it.
*
The way you put it, they intentionally wanted to make GDDR4 look bad...
ikanayam
post Jan 4 2008, 12:53 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(clayclws @ Jan 3 2008, 11:49 AM)
The way you put it, they intentionally wanted to make GDDR4 look bad...
*
Now i wonder why they would do that....
http://www.driverheaven.net/reviews/X1950XTXreview/GDDR4.pdf

clayclws
post Jan 4 2008, 12:56 AM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


Surely ATI is the promoter...but they do not pioneer it.
ikanayam
post Jan 4 2008, 01:02 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(clayclws @ Jan 3 2008, 11:56 AM)
Surely ATI is the promoter...but they do not pioneer it.
*
Page 2 of the PDF tells me they were a lot more than a promoter.
clayclws
post Jan 4 2008, 01:08 AM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


QUOTE(ikanayam @ Jan 4 2008, 01:02 AM)
Page 2 of the PDF tells me they were a lot more than a promoter.
*
I've seen it before, but I never felt it meant so much. GDDR5 is also designed under the same JC-42.3 subcommittee, still led by Joe Macri. So is NVIDIA going to abandon GDDR5 as well?

This post has been edited by clayclws: Jan 4 2008, 01:14 AM
ikanayam
post Jan 4 2008, 01:20 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(clayclws @ Jan 3 2008, 12:08 PM)
I've seen it before, but I never felt it meant so much. GDDR5 is also designed under the same JC-42.3 subcommittee, still led by Joe Macri. So is NVIDIA going to abandon GDDR5 as well?
*
I think they're not too pleased with GDDR5 either, but I also think they made their point clear enough last round, and they probably won't sacrifice their lead over politics.
clayclws
post Jan 4 2008, 01:23 AM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


Ah well... being chairman does not mean much if the rest do not agree or work together. Since GDDR4 was kind of a flop, they won't repeat the same mistake. Here's hoping for GDDR5 in D9E. I think D9M and D9P will still use GDDR3.
raymond8341
post Jan 4 2008, 01:07 PM

Getting Started
**
Junior Member
53 posts

Joined: Sep 2006
From: Malaysia



NVIDIA's yet-to-be released GeForce 9800 GX2 in the "flesh." We reveal some of the specs and what should be expected.

The best way to think of the GeForce 9800 GX2 card is as an 8800 GPU that has been die shrunk to 65nm and placed in an SLI configuration in a "single" card. The 9800 GX2 is very reminiscent of 7950 GX2 of days past. (And we loved the 7950 GX2 at launch, but terrible support and diminishing returns soon painted it as one of NVIDIA's biggest failures since the 5800 series.)

The GeForce 9800 GX2 will launch in late February or early March as it now stands and will replace the 8800 Ultra (single GPU) card in NVIDIA's high-end product lineup. The 9800 GX2 is said to be at least 30% faster than an 8800 Ultra. While it is not clear from the pictures below, we are told it will support "Quad SLI."

[Image: GeForce 9800 GX2 Front]

[Image: GeForce 9800 GX2 Back]

GeForce 9800 GX2:

1. 1GB Frame Buffer

2. Two PCBs

3. Two 65nm GPUs Total

4. 256 Stream Processors Total

All of the information here comes from sources overseas that we consider trustworthy. Obviously some specifications could be a bit clearer, especially the memory bus. We would have expected two 768MB frame buffers (one per GPU) here, but that is not what is spelled out; we are currently guessing 512MB per GPU. Should we learn of any changes and/or corrections, we will certainly update this page and inform our readers on our daily news page.
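The frame-buffer guess can be sanity-checked with chip-count arithmetic. A sketch, assuming a 256-bit bus per GPU built from 32-bit-wide GDDR3 chips (both figures are assumptions, not confirmed specs):

```python
# Hypothetical breakdown of the rumored "1GB frame buffer" as 512MB/GPU.
chip_io_bits = 32                                # assumed chip interface width
bus_width_bits = 256                             # assumed per-GPU memory bus
chips_per_gpu = bus_width_bits // chip_io_bits   # 8 chips fill the bus
chip_capacity_mb = 512 // chips_per_gpu          # 64 MB (512 Mbit) per chip
total_mb = 2 * chips_per_gpu * chip_capacity_mb  # two GPUs
print(chips_per_gpu, chip_capacity_mb, total_mb)  # 8 64 1024
```

A 768MB-per-GPU configuration would instead need a 384-bit bus of twelve such chips, which is why the bus width matters for the guess.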




Source
Amal
post Jan 4 2008, 01:32 PM

Newbie
******
Senior Member
1,672 posts

Joined: Sep 2005


QUOTE(raymond8341 @ Jan 4 2008, 01:07 PM)
NVIDIA's yet-to-be released GeForce 9800 GX2 in the "flesh." We reveal some of the specs and what should be expected.

The best way to think of the GeForce 9800 GX2 card is as an 8800 GPU that has been die shrunk to 65nm and placed in an SLI configuration in a "single" card. The 9800 GX2 is very reminiscent of 7950 GX2 of days past. (And we loved the 7950 GX2 at launch, but terrible support and diminishing returns soon painted it as one of NVIDIA's biggest failures since the 5800 series.)

The GeForce 9800 GX2 will launch in late February or early March as it now stands and will replace the 8800 Ultra (single GPU) card in NVIDIA's high-end product lineup. The 9800 GX2 is said to be at least 30% faster than an 8800 Ultra. While it is not clear from the pictures below, we are told it will support "Quad SLI."

[Image: GeForce 9800 GX2 Front]
[Image: GeForce 9800 GX2 Back]
GeForce 9800 GX2:

  1. 1GB Frame Buffer

  2. Two PCBs

  3. Two 65nm GPUs Total

  4. 256 Stream Processors Total

All of the information here comes from sources overseas that we consider trustworthy. Obviously some specifications could be a bit clearer, especially the memory bus. We would have expected two 768MB frame buffers (one per GPU) here, but that is not what is spelled out; we are currently guessing 512MB per GPU. Should we learn of any changes and/or corrections, we will certainly update this page and inform our readers on our daily news page.
Source
*
It doesn't seem so fast..
Not only is it a 9 series, but also a GX2! Only 30% faster than an 8800 Ultra??
Not worth waiting for, then..

This post has been edited by Amal: Jan 4 2008, 01:35 PM
Core_Tracer
post Jan 4 2008, 03:40 PM

4 Stars Prodigy
****
Senior Member
599 posts

Joined: Oct 2007
From: Finding the Core



Wow... so cool...
Hope it won't repeat the 7950GX2 failure...

Ezonizs
post Jan 4 2008, 03:46 PM

~*~Freak~*~
******
Senior Member
1,055 posts

Joined: Sep 2007
From: Subang Jaya



The picture of the card looks so yong sui ("ugly" in Cantonese) lol
ianho
post Jan 4 2008, 03:53 PM

Cucimangkoklife
Group Icon
VIP
15,705 posts

Joined: Mar 2005
From: Tg. Rambutan
QUOTE(Core_Tracer @ Jan 4 2008, 03:40 PM)
Wow... so cool...
Hope it won't repeat the 7950GX2 failure...
*
7950GX2 was not a failure. It was a brilliant card. The failure was Quad SLI with 2 units of 7950GX2 running in SLI.
mADmAN
post Jan 4 2008, 04:00 PM

10k Club
********
All Stars
10,530 posts

Joined: Nov 2004
From: Petaling Jaya & Mid Valley


QUOTE(skylinegtr34rule4life @ Jan 4 2008, 03:02 PM)
wah why the card all got covered up like the hijjab style martyr laugh.gif this is like ISLAMIC card laugh.gif i prefer the back nekkid at least doh.gif doh.gif
*
Post reported. Your remark on religion is not funny, and I find it discriminatory and offensive.

Hope to NOT see you around for a couple of days.
Hornet
post Jan 4 2008, 04:18 PM

What?
*******
Senior Member
4,251 posts

Joined: Jan 2003
From: Malacca, Malaysia, Earth


QUOTE(ianho @ Jan 4 2008, 03:53 PM)
7950GX2 was not a failure. It was a brilliant card. The failure was Quad SLI with 2 units of 7950GX2 running in SLI.
*
Many were pissed off when nVidia released the GeForce 8 not long after they had spent on the 7950GX2.

I'm somewhat disappointed that this is the same old GeForce 8800, and only 30% faster. Something is not right.
Probably it is just another push for GeForce 8 before finally moving on to a new range of GPUs, which makes this GX2 a pointless card to spend on.

This post has been edited by Hornet: Jan 4 2008, 04:32 PM
t3chn0m4nc3r
post Jan 4 2008, 07:39 PM

Teh Necron Lord
*******
Senior Member
4,139 posts

Joined: Sep 2006
From: Internet


QUOTE(Hornet @ Jan 4 2008, 05:18 PM)
Many were pissed off when nVidia released the GeForce 8 not long after they had spent on the 7950GX2.

I'm somewhat disappointed that this is the same old GeForce 8800, and only 30% faster. Something is not right.
Probably it is just another push for GeForce 8 before finally moving on to a new range of GPUs, which makes this GX2 a pointless card to spend on.
*
No choice... the market demands more power... so they give us more power and demand more cash...
ianho
post Jan 4 2008, 07:39 PM

Cucimangkoklife
Group Icon
VIP
15,705 posts

Joined: Mar 2005
From: Tg. Rambutan
QUOTE(Hornet @ Jan 4 2008, 04:18 PM)
Many were pissed off when nVidia released the GeForce 8 not long after they had spent on the 7950GX2.

I'm somewhat disappointed that this is the same old GeForce 8800, and only 30% faster. Something is not right.
Probably it is just another push for GeForce 8 before finally moving on to a new range of GPUs, which makes this GX2 a pointless card to spend on.
*
I was a 7950GX2 user too. I enjoyed it fully for about 6 months and then moved on to the 8800GTX as soon as it appeared. No need to be pissed, as the LYN Garage Sale is a wonderful place for chronic upgraders to sell off the stuff they don't use after upgrading.
clayclws
post Jan 4 2008, 08:21 PM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


QUOTE(ianho @ Jan 4 2008, 07:39 PM)
I was a 7950GX2 user too. I enjoyed it fully for about 6 months and then moved on to the 8800GTX as soon as it appeared. No need to be pissed, as the LYN Garage Sale is a wonderful place for chronic upgraders to sell off the stuff they don't use after upgrading.
*
Oh, yeah... I agree with you. The only place to trade off stuff you don't want and get 1st/2nd-hand stuff cheaply. Sometimes things you can't get on the market are available as well.

I don't think they will bring the 9800GX2 to market that fast.
Hyde`fK
post Jan 4 2008, 11:00 PM

D9s Killer
*******
Senior Member
2,378 posts

Joined: Jan 2003
From: Miri,Sarawak,Malaysia Status: Dead!



GeForce 9800 & 9600 specs and availability

1. GeForce 9800 GX2 : 2 x 65nm G92, 256 shader processors, 2 x 256-bit memory interface, 1GB memory, Feb-Mar

2. GeForce 9800 GTX : Replaces the 8800 GTX, Feb-Mar

3. GeForce 9800 GT : Replaces the 8800 GT, Mar-Apr

4. GeForce 9600 GT : Launches on Feb 14th, 64 shader processors, 256-bit memory interface, 512MB memory, 8600 GTS < 9600 GT < 8800 GT, US$169


Source : http://forums.vr-zone.com/showthread.php?t=221448
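For what it's worth, the listed price and shader counts allow a rough value comparison. A sketch; the 8800 GT price below is an assumed street price, not from the list:

```python
# Dollars per shader processor, using the 9600 GT figures from the list
# above and an assumed ~US$249 street price for the 8800 GT.
cards = {
    "9600 GT": {"shaders": 64,  "usd": 169},   # from the list
    "8800 GT": {"shaders": 112, "usd": 249},   # price assumed
}
for name, card in cards.items():
    print(f"{name}: ${card['usd'] / card['shaders']:.2f} per shader")
```

On those assumed numbers the 8800 GT is the better per-shader value, which fits the 8600 GTS < 9600 GT < 8800 GT ordering given above.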
Hornet
post Jan 5 2008, 11:19 AM

What?
*******
Senior Member
4,251 posts

Joined: Jan 2003
From: Malacca, Malaysia, Earth


QUOTE(Hyde`fK @ Jan 4 2008, 11:00 PM)
GeForce 9800 & 9600 specs and availability

1. GeForce 9800 GX2 : 2 x 65nm G92, 256 shader processors, 2 x 256-bit memory interface, 1GB memory, Feb-Mar

2. GeForce 9800 GTX : Replaces the 8800 GTX, Feb-Mar

3. GeForce 9800 GT : Replaces the 8800 GT, Mar-Apr

4. GeForce 9600 GT : Launches on Feb 14th, 64 shader processors, 256-bit memory interface, 512MB memory, 8600 GTS < 9600 GT < 8800 GT, US$169
Source : http://forums.vr-zone.com/showthread.php?t=221448
*
It's funny how they put it that way, as if the 9800GX2 will be a monster lol. Basically, it's still 512MB per core. If data has to be duplicated across both cores, the card will effectively have only 512MB. The effective memory bus is also only 256-bit. Given that it's only 30% faster than an 8800 Ultra (though this card has twice the SPs), I think it goes to show how inefficient the card is. It's two PCBs slapped together, and probably not clocked at a very high frequency.

9800 is just a new number for the 65nm die shrink. It's still the same 15-month-old GPU.

I think nVidia is holding back the real next-gen chips until ATi releases their R700. Till then, it's a boring 1H 2008.
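The duplicated-memory point can be sketched as a minimal model, assuming SLI-style rendering where each GPU mirrors the full working set (512MB per GPU is the rumored figure from earlier posts):

```python
def effective_memory_mb(per_gpu_mb, gpus, mirrored=True):
    """Frame-buffer capacity as seen by an application.

    With SLI-style rendering each GPU keeps its own copy of textures
    and buffers, so capacity does not scale with GPU count.
    """
    return per_gpu_mb if mirrored else per_gpu_mb * gpus

advertised = 512 * 2                  # what the box says: 1024 MB
usable = effective_memory_mb(512, 2)  # what an app can actually use
print(advertised, usable)             # 1024 512
```

The same mirroring argument applies to the bus: two 256-bit interfaces moving copies of the same data behave more like one 256-bit interface than a 512-bit one.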
