 GeForce 9 series thread

emy_xvidia
post Dec 28 2007, 05:54 PM

Look at all my stars!!
*******
Senior Member
2,735 posts

Joined: Mar 2006
From: Malaysia - Swindon Town
QUOTE(clayclws @ Dec 28 2007, 02:20 PM)
I'm not sure whether this is true, but I'll just post it for the sake of it.

- Codenamed G100 <- Huh? Not D9E?
- 65nm process
- 256 shader processors <- Awesome~!
- 780MHz core clock
- 3200MHz memory clock
- 512-bit memory width <- About time they started maximizing bandwidth utilization on PCI Express!
- 2048MB (256X8) GDDR5 chips <- Overkill~!
- GDDR5 @ 0.25-0.5ns <- Overkill~!
- Dual DVI-out <- No HDMI or DisplayPort?
- Supports DX 10.1, VP3
- 15-25% lower TDP than 8800GTS <- That's cool...

Source
*
OMG! I hope the price won't be ridiculous!


Added on December 28, 2007, 5:56 pm: But this is just a rumoured spec, isn't it?

This post has been edited by emy_xvidia: Dec 28 2007, 05:56 PM
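If that rumoured spec were real, the peak memory bandwidth it implies is easy to sanity-check. A quick sketch using the rumoured 512-bit bus and 3200MHz effective memory clock, with the 8800 GTX's real figures thrown in for comparison:

```python
def memory_bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak theoretical bandwidth in GB/s = bytes per transfer * transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    transfers_per_second = effective_clock_mhz * 1_000_000
    return bytes_per_transfer * transfers_per_second / 1e9

# Rumoured spec above: 512-bit bus at 3200 MHz effective
print(memory_bandwidth_gb_s(512, 3200))   # 204.8 GB/s

# For comparison, the 8800 GTX: 384-bit bus at 1800 MHz effective
print(memory_bandwidth_gb_s(384, 1800))   # 86.4 GB/s
```

At 204.8 GB/s that would be well over double the 8800 GTX, which is part of why the spec reads as too good to be true.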
emy_xvidia
post Dec 29 2007, 02:23 PM

QUOTE(gundamseedw @ Dec 29 2007, 02:07 PM)
Wow, GDDR4 isn't even widely used yet and here comes GDDR5?
*
It's because GDDR5 is much cheaper than GDDR4, yet faster..
emy_xvidia
post Jan 2 2008, 04:38 PM

Then find someone kind enough to sponsor you a card..

Or better yet, the best way: go earn yourself some money..
emy_xvidia
post Jan 2 2008, 06:56 PM

QUOTE(X.E.D @ Jan 2 2008, 06:27 PM)
GTA IV is not coming to the PC. Rockstar's RAGE technology is not ported, and should never be.
*
Are you sure? Got any source? GTA was originally built for the PC, and now Rockstar has forgotten the PC already.. what a load of crap from them..
emy_xvidia
post Jan 3 2008, 06:05 PM

QUOTE(ikanayam @ Jan 3 2008, 02:16 PM)
Maybe I missed something, but I don't see where this "much cheaper" idea is coming from...
*
GDDR5 should be cheaper because it's made on a smaller production process than GDDR4; it's expected to be up to 4 times faster than GDDR3 while using 20% less power..

I guess since Samsung and Qimonda have already developed GDDR5, I can see why Nvidia wants to jump directly to it..
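To put the "up to 4 times faster than GDDR3" claim in perspective, here is a rough per-pin comparison. The Gbps/pin figures below are ballpark assumptions for illustration, not datasheet values:

```python
# Illustrative per-pin data rates (Gbps per pin); rough ballpark
# assumptions for the sketch, not official datasheet numbers.
rates = {
    "GDDR3 (typical 2007)": 2.0,
    "GDDR4": 2.8,
    "GDDR5 (announced target)": 5.0,
}

baseline = rates["GDDR3 (typical 2007)"]
for name, gbps in rates.items():
    # Show each type's rate and its speedup over the GDDR3 baseline
    print(f"{name}: {gbps:.1f} Gbps/pin ({gbps / baseline:.2f}x GDDR3)")
```

Even on these optimistic numbers, early GDDR5 lands closer to 2-3x typical GDDR3 per pin; "4x" would only hold against the slowest GDDR3 parts.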
emy_xvidia
post Jan 3 2008, 06:29 PM

QUOTE(ikanayam @ Jan 3 2008, 06:19 PM)
Smaller processes don't necessarily make it cheaper, especially not initially, and especially since they'll probably ship in larger capacities. Newer, faster memory will always command a price premium, and this will remain so in 2008. So I don't see the cheaper cards going to this memory type yet, because it's not a cost saving for them.
It will end up about 3-4 times faster, max, and that power consumption figure is not useful without knowing what speed it was compared against. You won't magically get 4 times the speed right off the bat; that will only come as it matures, closer to the end of its life cycle.
*
Yeah, agreed with what you say there, but mass production of GDDR5 by Samsung and Qimonda will be one of the reasons it ends up cheaper than GDDR4..

GDDR4 was first used on the X1950 XTX around August 2006, and it's still used only by a few models in ATI's lineup.. demand for GDDR4 is not that high, and now that Nvidia has decided to jump to GDDR5, it will affect the price of GDDR4 as well as of GDDR5 itself..

Nvidia influences the market so greatly because it holds the larger share of the current graphics market, if I'm not mistaken roughly 7:3 (it was once closer to 9:1), thanks to the current success of its 8 series..

This post has been edited by emy_xvidia: Jan 3 2008, 06:35 PM
emy_xvidia
post Jan 3 2008, 08:50 PM

QUOTE(ikanayam @ Jan 3 2008, 08:43 PM)
Seems you're repeating what I said earlier without properly understanding it. It's not going to be cheaper than GDDR4 right off the bat either. GDDR5 will likely command a price premium over other memory types throughout most if not all of 2008. Once again, to make things absolutely clear: price is not the main reason they're jumping to it.
*
I'm not saying they're jumping directly to GDDR5 because of the price alone.. Hynix, Samsung and Qimonda are already moving into mass production of GDDR5, which means the technology is already there, so why should they still be bound to GDDR4, right?
emy_xvidia
post Jan 3 2008, 11:40 PM

QUOTE(ikanayam @ Jan 3 2008, 08:56 PM)
It's a cost vs benefit thing. It may be cheaper to go with wider buses and slower memory. You don't just jump to the newest, most expensive thing just because it's "already there". Nvidia doesn't even jump to the newest process technology as soon as it's available; they prefer to stay slightly behind because it's cheaper and less problematic. Let the early adopters deal with the problems first.
*
It might be better then to go for wider buses and faster memory, right? And yes, it's about GDDR5 already being there.. Qimonda, for instance, skipped GDDR4 because they saw it as a niche product; it would have been a loss for them to produce GDDR4 while Samsung, their main competitor, had already announced GDDR5.. the life cycle of GDDR4 won't be that long, as we can see from Hynix, Samsung and Qimonda all having gone public with their GDDR5.. yes, it will be expensive initially, but in large quantities it will become cheaper..

And GDDR4 memory speed isn't that impressive; as you can see on most ATI products that use GDDR4, the speed doesn't differ much from optimized GDDR3.. Nvidia plans to abandon it because Qimonda announced GDDR3 able to achieve a 1GHz clock, almost equal to GDDR4 memory and maybe even beyond that, expected to reach as high as 1.2GHz (2.4GHz effective)... the need for GDDR4 is smaller because it's used only by ATI, and the market share for ATI cards isn't that impressive either..
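For anyone confused by the "effective" clocks quoted in these posts: the effective figure is just transfers per second. GDDR3/GDDR4 are double data rate (2 transfers per clock), while GDDR5 moves 4 transfers per command clock. A tiny sketch of the arithmetic (the 1.0GHz GDDR5 command clock is only an illustrative number):

```python
def effective_mt_s(base_clock_mhz: float, transfers_per_clock: int) -> float:
    """Effective data rate in MT/s (what vendors quote as the 'effective clock')."""
    return base_clock_mhz * transfers_per_clock

print(effective_mt_s(1200, 2))  # GDDR3 at 1.2GHz -> 2400.0, i.e. the "2.4GHz effective" above
print(effective_mt_s(1000, 4))  # GDDR5 at a 1.0GHz command clock -> 4000.0
```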

QUOTE(clayclws @ Jan 3 2008, 09:50 PM)
I think he meant to say that GDDR4 was only mass subscribed by AMD/ATI, while GDDR5 will be mass subscribed by AMD/ATI AND NVIDIA (not sure about Intel with their new graphics technology). So essentially, it will become cheaper faster than GDDR4 did. I reckon AMD/ATI will stick with GDDR4 for the time being, while NVIDIA may use GDDR3 first before jumping into GDDR5 for GeForce 9. Just a thought... not a fact.


Added on January 3, 2008, 9:51 pm: And I still can't figure out the reasoning behind NVIDIA not using GDDR4.
*
There are no clear reasons why they want to skip GDDR4, but we can guess why from the performance benefits and manufacturing costs of GDDR4 and GDDR5 as announced by the major memory manufacturers..
emy_xvidia
post Jan 3 2008, 11:59 PM

QUOTE(clayclws @ Jan 3 2008, 11:50 PM)
Also, maybe because there is one less GDDR4 RAM manufacturer - Qimonda - so the price for GDDR4 was not competitive enough for NVIDIA to consider.
*
Not sure about this, but possible.. I guess Nvidia wants to produce GDDR5-based cards so badly in order to push people away from the GDDR4-based cards ATI is currently selling.. lol..

Obviously people will prefer something faster and newer, right?
emy_xvidia
post Jan 4 2008, 12:02 AM

QUOTE(skylinegtr34rule4life @ Jan 4 2008, 12:00 AM)
Another dirty trick by NVIDIA again. But why DX10 and not DX10.1?
*
Yeah, DirectX 10 based games are hardly playable on the current generation of cards, and now 10.1 is out already.. what a load of crap..
emy_xvidia
post Jan 5 2008, 03:46 PM

QUOTE(ikanayam @ Jan 4 2008, 12:45 AM)
No point having more bandwidth than you can use; external bandwidth is expensive. R600 was a good example of too much bandwidth. You always want maximum utilization of external bandwidth precisely because it is expensive.

Nvidia is not skipping GDDR4 for technical or cost reasons, as I have said before. The relatively slow development of GDDR4, the long life and extended development of GDDR3, and the quicker move to GDDR5 are a result of NV skipping it rather than the reason they are skipping it.
*
That's what I've been trying to say..

I personally think those are the reasons they are skipping it..


Added on January 5, 2008, 3:50 pm: [image]

At least a 580W PSU is needed.. do you think these beasts are sensible in their power consumption?

This post has been edited by emy_xvidia: Jan 5 2008, 03:50 PM
emy_xvidia
post Jan 5 2008, 04:11 PM

QUOTE(ianho @ Jan 5 2008, 04:08 PM)
Wah! My wet dream............. if they deliver on the Quad drivers this time. Last time I waited forever for the Quad drivers for the 7950GX2.
*
lol.. your wet dream will soon be reality.. you are sponsored, right?

I'm not very sure about the drivers though.. I thought Quad-SLI was a failure last time..
emy_xvidia
post Jan 6 2008, 07:49 PM

QUOTE(Hornet @ Jan 6 2008, 10:32 AM)
...

Expected to be 30% faster than Ultra
Ultra price may drop
*
The Ultra won't drop that much.. it still has value for users who want to do tri-SLI..

We've got to compare first how the new quad-SLI 9800GX2 performs side by side with tri-SLI 8800 Ultras...

This post has been edited by emy_xvidia: Jan 6 2008, 07:49 PM
emy_xvidia
post Jan 17 2008, 07:12 PM

1.1K? You should be getting an 8800GTS instead..
emy_xvidia
post Jan 21 2008, 08:00 PM

QUOTE(billytong @ Jan 21 2008, 03:46 PM)
I'm more interested in seeing a 9600GT with GDDR4 from some vendor. Something like the 2600XT GDDR4.
*
Which brand produces a 9600GT with GDDR4? Mind telling?
emy_xvidia
post Jan 24 2008, 08:18 PM

QUOTE(NetRaid @ Jan 24 2008, 05:35 PM)
Yup, good one akachester!

Being an NVidia distributor, this is our understanding of the current NVidia VGA positions:

1. 9800GX2 and 9600GT are NOT meant to replace any cards. They are additions to the line.

2. This is probably what the line would look like after Feb 2008: 9800GX2, 8800GTS, 8800GT, 9600GT, 8600GT DDR3 and so on...

Hope this clarifies some of the concerns here. So guys, the 8800GTS and 8800GT are still good buys for price/performance.

All the best.
*
The 9800GX2 is meant to be the replacement for the 8800 Ultra..

No more Ultra after this, because Nvidia will want to focus on selling as many of the sandwich card as possible..

Read here...
emy_xvidia
post Jan 28 2008, 04:26 PM

QUOTE(akachester @ Jan 28 2008, 02:43 PM)
LOL.. I am not doubting it, but it just seems weird to me for them to launch the GS for only a couple of months and then discontinue it. Oh well... the 9600GT does look interesting.
*
The GS is actually intended for the OEM market.. but being the profit-minded company it is, Nvidia does supply certain manufacturers (Asus, Palit, XFX) with 8800GS chips to be sold as retail products..
emy_xvidia
post Feb 12 2008, 02:44 PM

QUOTE(storm88 @ Feb 12 2008, 12:41 PM)
Adui... why is everyone so eager to say "save your $ on the 88GT and go for the 9600GT"?

Later, when those who wanted to buy the 88GT end up buying the 96GT instead, I bet you ALL will cry.
*
Let them cry, lol, since they still want to believe the 'fact' that the 96GT is more powerful than the 88GT..
emy_xvidia
post Feb 12 2008, 04:56 PM

QUOTE(skylinegtr34rule4life @ Feb 12 2008, 04:22 PM)
Wow, I was expecting a thin, roti-prata-shaped 8800GT. This custom baby looks like a 7900GT!
*
That's the Asus custom-made Glaciator cooler.. to the owner: care to post its idle and load temps?
emy_xvidia
post Feb 12 2008, 05:12 PM

QUOTE(iman_210 @ Feb 12 2008, 05:07 PM)
Nice... what resolution are you on? AA on?

And 38 Celsius idle temp... that's cool...
*
As expected from the Glaciator heatsink.. I wonder what the load temp is..
