GDDR4 - waiting for it? Look again, it's GDDR5!!!!!
|
TSkmarc
|
Nov 3 2007, 05:36 PM
|
QUOTE(ianho @ Nov 3 2007, 05:20 PM) Wei, careful what you wish for. Supposing a new one comes out every year, and we're already at GDDR5, you only have 15 years left!
Won't be that fast la. If not for Qimonda, GDDR4 would have stayed mainstream for another couple of years. By the time GDDR20 comes out, I'll probably be 60 years old. If I can still game with GDDR20 at that age, I'll die a happy man!!! This post has been edited by kmarc: Nov 3 2007, 05:36 PM
|
verticalar
|
Nov 5 2007, 03:26 PM
|
Very fast RAM lo... I wonder if the next-gen graphics cards will have these babies...
|
ronho
|
Nov 7 2007, 11:10 PM
|
Really wonder if mid-range users can see the difference between GDDR3 and GDDR4... have any of you guys actually tested these out to see the differences? Just wanted to know, as sometimes the suppliers promote it but the difference isn't much, and users still pay through the nose...
|
mgxbox
|
Nov 9 2007, 09:54 PM
|
QUOTE(ronho @ Nov 7 2007, 11:10 PM) Really wonder if mid-range users can see the difference between GDDR3 and GDDR4... have any of you guys actually tested these out to see the differences?
If you look at GDDR3 & GDDR4 on paper, GDDR4 looks ahead, but the actual speed is not that different. It depends on many factors, not just the RAM itself. For example, the ATI 2600XT with GDDR4 loses to the NVIDIA 8600GT with GDDR3 in most game benchmarks, so you know there is not much difference right now; maybe in 1 or 2 years, when software is made to take advantage of it. If you compare GDDR2 to GDDR4, yes, there is a big difference.
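Quick back-of-envelope math on that example. The clock and bus-width numbers below are just the commonly quoted specs for those two cards (my assumption, not something tested in this thread): peak memory bandwidth is simply effective data rate times bus width. A minimal Python sketch:
CODE
# Peak theoretical memory bandwidth in GB/s:
# effective rate (MT/s) x bus width (bits) / 8 bits-per-byte / 1000
def peak_bw_gb_s(rate_mt_s, bus_bits):
    return rate_mt_s * bus_bits / 8 / 1000

# Assumed specs: HD 2600XT GDDR4 ~2200 MT/s effective, 8600GT GDDR3
# ~1400 MT/s effective, both on 128-bit buses (from memory, check the box)
print(peak_bw_gb_s(2200, 128))  # ~35.2 GB/s for the GDDR4 card
print(peak_bw_gb_s(1400, 128))  # ~22.4 GB/s for the GDDR3 card
So on paper the GDDR4 card actually has more memory bandwidth and still loses the benchmarks, which backs up the point that the GPU matters more than the memory type.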
|
t3chn0m4nc3r
|
Nov 9 2007, 10:56 PM
|
QUOTE(sHawTY @ Nov 3 2007, 05:11 PM) Haven't seen any GC that uses GDDR5. Even finding GCs that use GDDR4 is kinda hard.
Err... really...? Up-to-date games still lag on my current rig, which I spent almost RM4k to build... if you're talking about old games, please don't compare, because I can play those with an Intel GMA950... lowly hardware for lowly software...
|
khaidani
|
Nov 10 2007, 07:53 AM
|
Seems ATi failed to benefit from GDDR4 on their current GCs? I tried my friend's HD2600XT with GDDR4, but my X1950 Pro seems much better.
|
Createmous
|
Nov 12 2007, 11:42 PM
|
QUOTE(khaidani @ Nov 10 2007, 07:53 AM) Seems ATi failed to benefit from GDDR4 on their current GCs? I tried my friend's HD2600XT with GDDR4, but my X1950 Pro seems much better.
Friends of mine also say that GDDR4 is really not necessary at the moment; otherwise NVIDIA would already be using GDDR4 on its 8800GT.
|
besaid
|
Dec 3 2007, 04:23 PM
|
QUOTE(Createmous @ Nov 12 2007, 11:42 PM) Friends of mine also say that GDDR4 is really not necessary at the moment; otherwise NVIDIA would already be using GDDR4 on its 8800GT.
Your friend... is an idiot. Don't listen to him from now on... GeForce 8 series Wiki
|
ikanayam
|
Dec 3 2007, 08:35 PM
|
QUOTE(besaid @ Dec 3 2007, 03:23 AM) Your friend... is an idiot. Don't listen to him from now on... GeForce 8 series Wiki
So... what's the wiki link for? Where does it prove your point?
|
TSkmarc
|
Dec 3 2007, 08:44 PM
|
QUOTE(Createmous @ Nov 12 2007, 11:42 PM) Friends of mine also say that GDDR4 is really not necessary at the moment; otherwise NVIDIA would already be using GDDR4 on its 8800GT.
Well, do you realize that a quad core is also not necessary for most people at the moment? If not for folding, I wouldn't want a quad core now anyway... Besides, what is the capability of the current GDDR4 in ATI's 3870? Did somebody mention 2.6GHz...?
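For context, a rough calculation using the same bandwidth formula as before. The 2.6GHz effective figure is from this post; the 256-bit bus width for the HD 3870 is my assumption, so treat this as a sketch:
CODE
# effective data rate (GT/s) x bus width (bits) / 8 bits-per-byte
rate_gt_s = 2.6   # the 2.6GHz effective figure mentioned above
bus_bits = 256    # assumed bus width for the HD 3870
print(rate_gt_s * bus_bits / 8)  # 83.2 GB/s theoretical peak
Shipping cards may clock lower than the chip rating, so that's a ceiling, not a benchmark result.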
|
SlayerXT
|
Dec 4 2007, 01:39 AM
|
Samsung has already debuted the world's fastest GDDR5, which can operate at a whopping 24GB/s. That's really crazy. By that time we'll already be able to play Crysis at 60+ fps on ultra high.
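If that 24GB/s is the per-chip figure (GDDR chips normally have a 32-bit interface, so this is my reading of the announcement, not something stated in the thread), the per-pin rate and a hypothetical full card work out like this:
CODE
# 24 GB/s per chip over a 32-bit chip interface -> per-pin data rate
pin_rate_gbps = 24 * 8 / 32              # 6.0 Gbps per pin
# hypothetical 256-bit card built from 8 such chips:
card_bw_gb_s = pin_rate_gbps * 256 / 8   # 192.0 GB/s theoretical
print(pin_rate_gbps, card_bw_gb_s)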
|
Terence573
|
Dec 4 2007, 12:49 PM
|
Huhu, why is there no DDR5 for system RAM? Anyway, DDR and GDDR are different. For solid evidence, ATI uses GDDR4 for their cards while NVIDIA just stays with GDDR3, but the result is that NVIDIA still pwns the GDDR4 cards in terms of performance. I'm not saying GDDR4 is not good; maybe the performance just isn't optimized by ATi. If NVIDIA were to use GDDR4, maybe it would be good too. But considering NVIDIA hasn't used a single GDDR4 chip on their 8 series cards, it just makes me wonder why...
|
InnerMax
|
Dec 4 2007, 12:55 PM
|
QUOTE(ikanayam @ Dec 3 2007, 08:35 PM) So... what's the wiki link for? Where does it prove your point?
Maybe it's a guide or some kind of enlightenment for some... err... I dunno, really, what's the wiki for?
|
TSkmarc
|
Dec 5 2007, 07:56 PM
|
Hmmmm.... looks like Hynix is also skipping GDDR4 and jumping to GDDR5..... http://www.xbitlabs.com/news/memory/displa...mory_Chips.html
Quote from the above article: "With its improved speed and power characteristics, GDDR5 is projected to succeed GDDR3 and dominate the graphics DRAM market from the second half of 2008."
|
ikanayam
|
Dec 5 2007, 07:58 PM
|
QUOTE(Terence573 @ Dec 3 2007, 11:49 PM) ...considering NVIDIA hasn't used a single GDDR4 chip on their 8 series cards, it just makes me wonder why...
It's for political reasons rather than technological ones.
|
Terence573
|
Dec 5 2007, 09:05 PM
|
QUOTE(ikanayam @ Dec 5 2007, 07:58 PM) It's for political reasons rather than technological ones.
Political? How so?
|
timljh
|
Dec 7 2007, 06:42 PM
|
QUOTE(Terence573 @ Dec 4 2007, 12:49 PM) ...considering NVIDIA hasn't used a single GDDR4 chip on their 8 series cards, it just makes me wonder why...
I suppose they are still new with GDDR4; just like the shift from 90nm to 65nm, it takes some time to mature... This post has been edited by timljh: Dec 7 2007, 06:46 PM
|
CYBERJUDGE
|
Dec 8 2007, 11:53 AM
|
Samsung is the one releasing it... change your mobo... sell your GC...
|
TSkmarc
|
May 11 2008, 09:23 PM
|
Some updates on GDDR5: GDDR5 in production
QUOTE Qimonda, a leading manufacturer of advanced dynamic random access memory (DRAM), said that it can deliver next-generation GDDR5 memory in volume for graphics cards and other applications that require high memory bandwidth. At this point Qimonda can supply makers of graphics boards with GDDR5 memory at up to 4.50GHz clock speed.
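Reading that 4.50GHz as the effective per-pin data rate (that's how GDDR speeds are usually marketed, my assumption), here's what it would mean on a typical 256-bit board:
CODE
# 4.5 Gbps per pin x 256-bit bus / 8 bits-per-byte
print(4.5 * 256 / 8)  # 144.0 GB/s theoretical peak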
|
TSkmarc
|
Nov 25 2008, 12:12 PM
|