GeForce 9 series thread

Hornet
Dec 25 2007, 10:35 AM
QUOTE(jcliew @ Dec 25 2007, 09:27 AM) The new 8800GT and 8800GTS G92 just launched recently, and now the GeForce 9 series is planned to launch by Feb 2008?? Can't invest in graphics cards anymore since the life cycle is that short.

Those are just refresh parts. It's still the same GeForce 8, just on a smaller fabrication process. A new architecture is long overdue; GeForce 8 has been on the market for more than a year now.

ikanayam
Dec 25 2007, 10:42 AM
QUOTE(Hornet @ Dec 24 2007, 09:35 PM) Those are just refresh parts. It's still the same GeForce 8, just on a smaller fabrication process. A new architecture is long overdue; GeForce 8 has been on the market for more than a year now.

It won't be a big change from G8x. If people are expecting a G7x-to-G8x type of change, they will be very disappointed. G8x is already a solid base for a DX10 chip, so there is little reason for them to break it. We probably won't see anything really new till DX11.

boblp
Dec 25 2007, 10:54 AM
QUOTE(ikanayam @ Dec 25 2007, 10:42 AM) It won't be a big change from G8x. If people are expecting a G7x-to-G8x type of change, they will be very disappointed. G8x is already a solid base for a DX10 chip, so there is little reason for them to break it. We probably won't see anything really new till DX11.

By that time, I hope I'm still keeping up with the fast pace of improvement in technology.

clayclws
Dec 25 2007, 09:37 PM
Seeing how far G92 jumped ahead of G80, I don't think D9P and D9E will be a huge improvement over D8P... significant, yes, but not enough to play Crysis at 2560x1600 at full details with a minimum of 30fps. Still, an improvement is an improvement. I hope the circulated 19th February release date is accurate; I can't wait any longer to play those games at high details...

LEVIATHAN
Dec 25 2007, 09:43 PM
Upgrading for the sake of GTA IV. I'm going to build my gaming rig by the end of next year. Yeah.

clayclws
Dec 25 2007, 09:44 PM
GTA IV won't tax our systems that much.

Terence573
Dec 25 2007, 10:04 PM
I think I'll have to skip 9 and wait for X (10)... wow, nice name for it: GeForce X.

clayclws
Dec 25 2007, 10:17 PM
Personally, I think they should rebrand the GeForce name when they reach 10...

spartacvs
Dec 25 2007, 10:26 PM
I've never used any of the Nvidia GeForce graphics cards before, but I'm planning to get one in Feb 2008. I doubt I'll go for the 9 series as it seems way too high end, not to mention the price that follows it. My target is actually the 8800GT, but I've heard many recommend the GTS version instead. Is there any huge difference between the two? Merry Christmas, y'all!

S4PH
Dec 25 2007, 10:30 PM
DirectX 10 is still in development; it's young and not mature yet, which is why frame rates under DX10 suck. Check out guru3d.com: most of the benchmarks there are run on WinXP with DX9. Maybe we need to wait till Vista SP2 is out before it's time to move to DX10 with 4GB of RAM. So I think even a GeForce 9 series built on the old 8 series architecture won't improve performance significantly.

clayclws
Dec 28 2007, 02:20 PM
I'm not sure whether this is true or not, but I'll just post it for the sake of it:

- Codenamed G100 <- Huh? Not D9E?
- 65nm process
- 256 shader processors <- Awesome~!
- 780MHz core clock
- 3200MHz memory clock
- 512-bit memory bus <- About time they started maximizing memory bandwidth~!
- 2048MB (256MB x 8) GDDR5 chips <- Overkill~!
- GDDR5 @ 0.25-0.5ns <- Overkill~!
- Dual DVI-out <- No HDMI or DisplayPort?
- Supports DX 10.1, VP3
- 15-25% lower TDP than 8800GTS <- That's cool...

Source

This post has been edited by clayclws: Dec 28 2007, 02:21 PM
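
If those rumored figures are real, the peak memory bandwidth is simple arithmetic: bus width in bytes times the effective data rate. A minimal Python sketch, assuming the quoted "3200MHz" means the effective data rate (3200 MT/s), as rumor sheets of the era usually meant:

CODE
# Back-of-envelope peak memory bandwidth from the rumored specs above.
# Assumptions (unconfirmed rumor figures): 512-bit bus, and "3200MHz"
# taken as the effective data rate in MT/s.

bus_width_bits = 512        # rumored memory bus width
data_rate_mts = 3200        # rumored effective memory clock, MT/s

bytes_per_transfer = bus_width_bits // 8              # 64 bytes per transfer
bandwidth_gb_s = bytes_per_transfer * data_rate_mts / 1000.0

print(f"Rumored G100 peak: {bandwidth_gb_s:.1f} GB/s")            # ~204.8 GB/s

# For comparison, the 8800 GTX (384-bit bus, 900MHz GDDR3 = 1800 MT/s):
print(f"8800 GTX peak:     {384 // 8 * 1800 / 1000.0:.1f} GB/s")  # ~86.4 GB/s

# The quoted "0.25-0.5ns" GDDR5 cycle time corresponds to 2-4 GHz parts:
for ns in (0.25, 0.5):
    print(f"{ns}ns cycle time -> {1 / ns:.0f} GHz")

At roughly 205 GB/s, that would be well over double the 8800 GTX's 86.4 GB/s, which is the kind of jump a 512-bit bus plus GDDR5 would imply.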

ikanayam
Dec 28 2007, 02:36 PM
This thread and the other one need to be merged. The double posting is getting ridiculous.

clayclws
Dec 28 2007, 02:44 PM
Sorry for the double posting... I thought it was relevant... albeit in two different threads.

azamfuat
Dec 28 2007, 05:11 PM
Waaaah. I can learn a lot from you guys.

Breaktru
Dec 28 2007, 05:51 PM
IMO, it will be like the transition from NV40 to G70; simply put, 6800 to 7800. An improvement of the base architecture, and that's already good enough to play Crysis.

emy_xvidia
Dec 28 2007, 05:54 PM
QUOTE(clayclws @ Dec 28 2007, 02:20 PM) I'm not sure whether this is true or not, but I'll just post it for the sake of it: [rumored G100 spec list snipped]

OMG! I hope the price won't be that ridiculous!

Added on December 28, 2007, 5:56 pm: But this is just a rumoured spec, isn't it??

This post has been edited by emy_xvidia: Dec 28 2007, 05:56 PM

Terence573
Dec 28 2007, 07:54 PM
QUOTE(clayclws @ Dec 28 2007, 02:20 PM) I'm not sure whether this is true or not, but I'll just post it for the sake of it: [rumored G100 spec list snipped]

Wow, GDDR5!! If it's true then Nvidia is making a leap from 3rd to 5th generation, haha. Spec-wise it's awesome! I just wish the price won't be too "awesome"...

Godek
Dec 28 2007, 08:03 PM
Hope ATI can counter this, hehe. Competition is good for us. Nuff said.

clayclws
Dec 29 2007, 05:45 AM
It should be just speculation... They may use GDDR5... but 2GB is just overkill. The memory bandwidth and shader count look promising, though I wouldn't rely on the source. Still... it looks damn good on paper... erm... screen...

gundamseedw
Dec 29 2007, 02:07 PM
Wow? GDDR4 isn't even widely used yet, and here comes GDDR5.