GeForce 9 series thread
Dec 2 2007, 09:25 PM
#1
Senior Member
2,659 posts · Joined: Sep 2006 · From: Miri, PJ & KL
Here is something to get your mind scratching...
Dec 4 2007, 10:39 PM
#2
QUOTE(cscheat @ Dec 4 2007, 09:03 PM)

Maybe...but most likely not. They haven't even embarked on GDDR4. But some quarters do argue that since GDDR5 is already at stable production capacity, they might as well jump directly to GDDR5 instead of GDDR4. My money is on GDDR4, though.
Dec 6 2007, 11:16 AM
#3
First batch of cards? GeForce 9? Huh? First batch for DirectX 10.1, as opposed to GeForce 8 being the first batch for DirectX 10? How much do you people actually know about DirectX 10 and 10.1?
Dec 11 2007, 08:02 AM
#4
QUOTE(Faint @ Dec 11 2007, 02:01 AM)
It's not a good idea to get 2 graphics cards at a time for SLI. Best to SLI (or CF) when one graphics card can't handle the most demanding game out there anymore...say around 6 months or more.

Gamers with lots of moolah can change anytime they want. I don't know why it is important for them to brag about having the best of the best hardware available. Maybe it makes them proud...I game and do lots of graphical work on my comp, but I only upgrade when there are significant changes, say about 2 generations ahead (GF6 to GF8, etc.)
Dec 11 2007, 11:54 PM
#5
QUOTE(E-J@1 @ Dec 11 2007, 11:00 PM)
just like buying those RM100k over car why not just buy a kancil, eventually both car will get u to the destination
Dec 16 2007, 11:23 AM
#6
I love PC gaming, but I do see the perks of console gaming as well. First and foremost, you don't need to upgrade. Secondly, the games are optimized for the system as the developers get more and more familiar with the console.

Still, I love Starcraft 2 on my PC, and FPS with keyboard and mouse...not some auto-aiming device to help me frag... But all this is getting off topic...D9M, D9P & D9E...not much news, eh?
Dec 16 2007, 02:34 PM
#7
QUOTE(ikanayam @ Dec 16 2007, 01:03 PM)
Most people won't get much reliable info until close to the launch, going by the trend since G80. They've done a lot of misdirection to keep things seekrit lately.

Yeah well...guess I'm gonna get an 8800GT or GTS anyway...not exactly sure if the February launch date is real or not.
Dec 25 2007, 09:37 PM
#8
Seeing how far G92 jumped ahead of G80, I think D9P and D9E would not be a great improvement over D8P...significant, yes, but not enough to play Crysis at 2560x1600 at full details with a minimum of 30fps. Still, an improvement is an improvement. I hope the circulated 19th February release date is accurate. I can't wait any longer to play those games at high details...
Dec 25 2007, 09:44 PM
#9
GTAIV will not tax our system so much la.
Dec 25 2007, 10:17 PM
#10
Personally, I think they should rebrand their GeForce name when they reach 10...
Dec 28 2007, 02:20 PM
#11
I'm not sure if this is true or not, but I'll just post it for the sake of it:

- Codenamed G100 <- Huh? Not D9E?
- 65nm process
- 256 shader processors <- Awesome~!
- 780MHz core clock
- 3200MHz memory clock
- 512-bit memory width <- About time they start maximizing the utilization of bandwidth on PCIexpress~!
- 2048MB (256MB x 8) GDDR5 chips <- Overkill~!
- GDDR5 @ 0.25-0.5ns <- Overkill~!
- Dual DVI-out <- No HDMI or DisplayPort?
- Supports DX 10.1, VP3
- 15-25% lower TDP than 8800GTS <- That's cool...

Source

This post has been edited by clayclws: Dec 28 2007, 02:21 PM
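Out of curiosity, the rumored memory figures imply a theoretical peak bandwidth that's easy to work out. A quick sketch (assuming the quoted 3200MHz is the effective data rate per pin, which is how these rumor round-ups usually quote it; the 8800GTX numbers are included just for comparison):

```python
def memory_bandwidth_gbps(effective_clock_mhz: float, bus_width_bits: int) -> float:
    """Theoretical peak memory bandwidth in GB/s (1 GB = 1e9 bytes)."""
    bytes_per_transfer = bus_width_bits / 8      # 512-bit bus -> 64 bytes per transfer
    return effective_clock_mhz * 1e6 * bytes_per_transfer / 1e9

# Rumored G100: 3200MHz effective on a 512-bit bus
print(memory_bandwidth_gbps(3200, 512))   # 204.8 GB/s

# 8800GTX for comparison: 1800MHz effective GDDR3 on a 384-bit bus
print(memory_bandwidth_gbps(1800, 384))   # 86.4 GB/s
```

If the rumor holds, that's well over double the 8800GTX's bandwidth, which would at least be in proportion to the jump in shader count.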
Dec 28 2007, 02:44 PM
#12
Sorry for the double posting...I thought it was relevant...albeit in two different threads.
Dec 29 2007, 05:45 AM
#13
It's probably just speculation...They may use GDDR5...but 2GB is just overkill. The memory bandwidth and shader count look promising, though I wouldn't rely on the source. Still...looks damn good on paper...erm...screen...
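For what it's worth, the "0.25-0.5ns" chip rating in the rumor can be cross-checked against the quoted 3200MHz memory clock: a memory chip's ns rating is just the period of its rated clock, so rated MHz = 1000 / ns. A quick sketch (assuming both figures refer to the same effective data rate, which is how these spec sheets usually quote them):

```python
def ns_rating_to_mhz(ns: float) -> float:
    """Convert a memory chip's cycle-time rating (ns) to its rated clock (MHz)."""
    return 1000.0 / ns

# The rumored chips: 0.25ns (fast bin) down to 0.5ns (slow bin)
print(ns_rating_to_mhz(0.25))  # 4000.0 MHz
print(ns_rating_to_mhz(0.5))   # 2000.0 MHz
# The rumored 3200MHz memory clock falls inside that 2000-4000MHz window,
# so at least those two numbers in the rumor are consistent with each other.
```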
Jan 2 2008, 08:47 PM
#14
Jan 3 2008, 01:32 PM
#15
Jen-Hsun Huang is Taiwanese...and they did release a GeForce 4, didn't they? GDDR5 is simply a much cheaper move than GDDR4. Ikanayam mentioned that there's some political stuff involved in why NVIDIA isn't shifting towards GDDR4...you'll need to ask him more about that.
Jan 3 2008, 02:28 PM
#16
QUOTE(ikanayam @ Jan 3 2008, 02:16 PM)

Err...my mistake. A more rational move in terms of money, I meant. GDDR4 is not that much of an improvement over GDDR3, but GDDR5 is, since both Qimonda and Samsung managed to achieve GDDR5 specs while manufacturing GDDR4.

This post has been edited by clayclws: Jan 3 2008, 02:28 PM
Jan 3 2008, 02:45 PM
#17
QUOTE(ikanayam @ Jan 3 2008, 02:39 PM)
GDDR3->GDDR4 is as big an improvement as GDDR4->GDDR5. However, the fact that NVIDIA is strongly against it likely affected GDDR4 development, while it helped push GDDR3 speeds higher and sped up development of GDDR5. No one wants to bother too much with a product if there isn't much volume, which is why you see some manufacturers skipping GDDR4 completely. edit: It is also very likely that the NV chips since G80 have had GDDR4 support in case they needed it to be competitive, but they never needed it because things worked out as they wanted.

That's the reason why it is a more rational move in terms of money (I mistakenly said "much cheaper"): volume. So, why is it that NVIDIA does not support GDDR4? And is AMD going to support GDDR5?
Jan 3 2008, 09:50 PM
#18
I think he meant that GDDR4 was only adopted in volume by AMD/ATI, while GDDR5 will be adopted in volume by AMD/ATI AND NVIDIA (not sure about Intel with their new graphics technology). So essentially, it will get cheaper faster than GDDR4 did. I reckon AMD/ATI will stick with GDDR4 for the time being, while NVIDIA may use GDDR3 first before jumping to GDDR5 for GeForce 9. Just a thought...not a fact.

Added on January 3, 2008, 9:51 pm: And I still can't figure out the reasoning behind NVIDIA not using GDDR4.

This post has been edited by clayclws: Jan 3 2008, 09:51 PM
Jan 3 2008, 11:50 PM
#19
Also, maybe it's because there is one less GDDR4 manufacturer - Qimonda - so GDDR4 pricing was never competitive enough for NVIDIA to consider.
Jan 4 2008, 12:22 AM
#20
QUOTE(emy_xvidia @ Jan 3 2008, 11:59 PM)
not sure about this but possible.. i guess Nvidia wants to produce GDDR5-based cards so badly to force people to abandon the GDDR4-based cards which ATI is currently selling.. lol.. obviously people will prefer something faster and newer rite?

GDDR4 is better than GDDR3, but no, NVIDIA didn't use it - nor did that stop the thousands of people using the 8800GT, GTS and GTX.

QUOTE(skylinegtr34rule4life @ Jan 4 2008, 12:00 AM)

How sure are you that NVIDIA is playing tricks?

QUOTE(emy_xvidia @ Jan 4 2008, 12:02 AM)
yeah, DirectX 10 based games is hardly playable with current generation of cards, then the 10.1 is out already..

Bioshock is very playable at full details at 1920x1200 with an 8800GT.