GeForce 9 series thread

clayclws
post Dec 2 2007, 09:25 PM

Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


Here is something to get you scratching your head...
clayclws
post Dec 4 2007, 10:39 PM

QUOTE(cscheat @ Dec 4 2007, 09:03 PM)
I'm sure the new 9 series will have GDDR5
*
Maybe...but most likely not. They haven't even moved to GDDR4 yet. But some quarters do argue that since GDDR5 is already at stable production capacity, they might as well jump directly to GDDR5 instead of GDDR4. My money is on GDDR4 though.
clayclws
post Dec 6 2007, 11:16 AM

First batch of cards? GeForce 9? Huh? First batch for DirectX 10.1? As opposed to GeForce 8 being the first batch for DirectX 10? How much do you people know about DirectX 10 and 10.1?
clayclws
post Dec 11 2007, 08:02 AM

QUOTE(Faint @ Dec 11 2007, 02:01 AM)
3 or 6 months? If a user buys 2 x 8800 Ultra, will they still need to change after 6 months?
*
It's not a good idea to get 2 graphics cards at once for SLI. It's best to SLI (or CF) when a single graphics card can't handle the most demanding game out there anymore...say around 6 months or more down the road. Gamers with lots of moolah can change anytime they want. I don't know why it is important for them to brag about having the best of the best hardware available. Maybe it makes them proud...

I game and do lots of graphical work on my comp, but I only upgrade when there are significant changes, say about 2 generations ahead (GF6 to GF8, etc.).
clayclws
post Dec 11 2007, 11:54 PM

QUOTE(E-J@1 @ Dec 11 2007, 11:00 PM)
Those people with moolah upgrade often to satisfy personal needs.

Just like buying those RM100k-plus cars.

Why not just buy a Kancil? Eventually both cars will get you to the destination.
*
It's a different analogy...Normally, people who buy BMW, Merc, VW, Audi, Porsche, Ferrari, Bentley, Rolls Royce, etc. will not replace their cars every time there is a newer and better version. If they do, show me the daughter or granddaughter, and I'm going to court her. Money face...
clayclws
post Dec 16 2007, 11:23 AM

I love PC gaming, but I do see the perks of console gaming as well. First and foremost, you don't need to upgrade. Secondly, the games are optimized for the system as the developers get more and more familiar with the console.

Still, I love StarCraft 2 on my PC and playing FPS games with keyboard and mouse...not some auto-aiming device to help me frag...

But all this is getting off topic...D9M, D9P & D9E...not much news, eh?
clayclws
post Dec 16 2007, 02:34 PM

QUOTE(ikanayam @ Dec 16 2007, 01:03 PM)
Most people won't get much reliable info until close to the launch, judging by the trend since G80. They've done a lot of misdirection to keep things secret lately.
*
Yeah well...guess I'm gonna get an 8800GT or GTS anyway...not exactly sure if the February launch date is real or not.
clayclws
post Dec 25 2007, 09:37 PM

Seeing how far G92 jumped ahead of G80, I don't think D9P and D9E will be a great improvement over D8P...significant, yes, but not enough to play Crysis at 2560x1600 with full details at a minimum of 30fps. Still, an improvement is an improvement. I hope the circulated 19th February release date is accurate. I can't wait any longer to play those games with high details...
clayclws
post Dec 25 2007, 09:44 PM

GTA IV will not tax our systems that much la.
clayclws
post Dec 25 2007, 10:17 PM

Personally, I think they should rebrand the GeForce name when they reach 10...
clayclws
post Dec 28 2007, 02:20 PM

I'm not sure whether this is true or not, but I'll just post it for the sake of it (see the quick arithmetic check after the list):

- Codenamed G100 <- Huh? Not D9E?
- 65nm process
- 256 shader processors <- Awesome~!
- 780MHz core clock
- 3200MHz memory clock
- 512-bit memory width <- About time they start maximizing the utilization of bandwidth on PCI Express~!
- 2048MB (256X8) GDDR5 chips <- Overkill~!
- GDDR5 @ 0.25-0.5ns <- Overkill~!
- Dual DVI-out <- No HDMI or DisplayPort?
- Supports DX 10.1, VP3
- 15-25% lower TDP than 8800GTS <- That's cool...

Source
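
A quick arithmetic check on those rumoured numbers (my own back-of-the-envelope sketch in Python, not from the source; the 512-bit bus, 3200MHz memory clock and 2048MB figure are taken from the list above as-is):

# Rough sanity check of the rumoured specs above (my own arithmetic, not from the source).
# Assumption: the quoted 3200MHz "memory clock" is the effective data rate, i.e. 3.2 Gbit/s per pin.
bus_width_bits = 512                      # rumoured memory bus width
data_rate_gbit_per_pin = 3.2              # 3200MHz effective
peak_bandwidth_gb_s = data_rate_gbit_per_pin * bus_width_bits / 8
print(peak_bandwidth_gb_s)                # 204.8 GB/s peak, vs roughly 86 GB/s on an 8800GTX

print(256 * 8)                            # 2048MB is consistent with eight 256MB chips

cycle_time_ns = 0.25                      # the faster end of the "0.25-0.5ns" rating
print(1 / (cycle_time_ns * 1e-9) / 1e9)   # about 4 Gbit/s per pin, i.e. headroom above the 3200MHz figure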

This post has been edited by clayclws: Dec 28 2007, 02:21 PM
clayclws
post Dec 28 2007, 02:44 PM

Sorry for the double posting...I thought it was relevant...albeit in two different threads.
clayclws
post Dec 29 2007, 05:45 AM

It's probably just speculation...They may use GDDR5...but 2GB is just overkill. The memory bandwidth and number of shaders look promising, but I wouldn't rely on the source. Still...looks damn good on paper...erm...screen...
clayclws
post Jan 2 2008, 08:47 PM

QUOTE(§layerXT @ Jan 2 2008, 08:37 PM)
It's very bad if a mid-range 9 series card pwns my 8800GT. I will cry again.
*
It depends on what you mean by mid-range. If it's the 9600GT, I don't think it'll beat your 8800GT. If it's the 9800GT, then it most probably would.
clayclws
post Jan 3 2008, 01:32 PM

Jen-Hsun Huang is Taiwanese...and they did release GeForce 4, didn't they? GDDR5 is simply a much cheaper move than GDDR4. Ikanayam mentioned that there's some political stuff involved in why NVIDIA isn't shifting towards GDDR4...you'll need to ask him more about that.
clayclws
post Jan 3 2008, 02:28 PM

QUOTE(ikanayam @ Jan 3 2008, 02:16 PM)
Maybe I missed something, but I don't see where this "much cheaper" idea is coming from...
*
Err...my mistake. It's the more rational move in terms of money. GDDR4 is not that much of an improvement over GDDR3, but GDDR5 is, since both Qimonda and Samsung managed to achieve the specs of GDDR5 while manufacturing GDDR4.
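
For a rough sense of the gap, here is an illustrative comparison in Python (the per-pin data rates are my own ballpark assumptions for typical shipping parts of each generation, not figures from anywhere in this thread):

# Illustrative only: assumed per-pin data rates, roughly typical for parts of this era.
BUS_WIDTH_BITS = 256  # a common memory bus width for mid/high-end cards of the time

for name, gbit_per_pin in [("GDDR3", 2.0), ("GDDR4", 2.2), ("GDDR5", 4.0)]:
    bandwidth_gb_s = gbit_per_pin * BUS_WIDTH_BITS / 8
    print(name, bandwidth_gb_s, "GB/s")   # about 64, 70 and 128 GB/s respectively

# On these assumed rates, GDDR3 -> GDDR4 is a modest step while GDDR4 -> GDDR5 is a big one,
# which is the sense in which GDDR4 "is not that much of an improvement".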

This post has been edited by clayclws: Jan 3 2008, 02:28 PM
clayclws
post Jan 3 2008, 02:45 PM

QUOTE(ikanayam @ Jan 3 2008, 02:39 PM)
GDDR3->GDDR4 is as big an improvement as GDDR4->GDDR5. However, the fact that NVIDIA is strongly against it likely affected GDDR4 development, while it helped push GDDR3 speeds higher and sped up development of GDDR5. No one wants to bother too much with a product if there isn't much volume, which is why you see some manufacturers skipping GDDR4 completely.

edit: It is also very likely that the NV chips since G80 have GDDR4 support in case they needed it to be competitive, but they never needed it because things worked out as they wanted.
*
That's the reason it's the more rational move in terms of money (which I mistakenly called "much cheaper"): volume. So why is it that NVIDIA does not support GDDR4? And is AMD going to support GDDR5?
clayclws
post Jan 3 2008, 09:50 PM

I think he meant that GDDR4 was only adopted in volume by AMD/ATI, while GDDR5 will be adopted in volume by both AMD/ATI and NVIDIA (not sure about Intel with their new graphics technology). So essentially, it will get cheaper faster than GDDR4 did. I reckon AMD/ATI will stick with GDDR4 for the time being, while NVIDIA may use GDDR3 first before jumping to GDDR5 for GeForce 9. Just a thought...not a fact.


Added on January 3, 2008, 9:51 pm: And I still can't figure out the reasoning behind NVIDIA not using GDDR4.

This post has been edited by clayclws: Jan 3 2008, 09:51 PM
clayclws
post Jan 3 2008, 11:50 PM

Also, maybe because there is one less GDDR4 RAM manufacturer - Qimonda - the price for GDDR4 was not competitive enough for NVIDIA to consider.
clayclws
post Jan 4 2008, 12:22 AM

QUOTE(emy_xvidia @ Jan 3 2008, 11:59 PM)
Not sure about this, but it's possible...I guess NVIDIA wants to produce GDDR5-based cards so badly in order to force people to abandon the GDDR4-based cards which ATI is currently selling...lol

Obviously people will prefer something faster and newer, right?
*
GDDR4 is better than GDDR3, but no, NVIDIA didn't use it. Neither did that stop the thousands of people using the 8800GT, GTS and GTX.

QUOTE(skylinegtr34rule4life @ Jan 4 2008, 12:00 AM)
Another dirty trick by NVIDIA again...but why DX10 and not DX10.1?
*
How sure are you that NVIDIA is playing tricks?

QUOTE(emy_xvidia @ Jan 4 2008, 12:02 AM)
Yeah, DirectX 10-based games are hardly playable with the current generation of cards, and now 10.1 is out already...what a load of crap...
*
BioShock is very playable with full details at 1920x1200 on an 8800GT.
