 GDDR4 - waiting for it?, Look again, it's GDDR5!!!!!

TSkmarc
post Jun 8 2007, 11:47 PM, updated 18y ago

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



Looks like "Qimonda" is skipping GDDR4 and going straight to GDDR5!!! rclxms.gif

Volume production expected in first half of 2008!!! rclxm9.gif

So, are we going to wait some more?? rclxub.gif rclxub.gif

http://www.digitimes.com/news/a20070607PD215.html
ikanayam
post Jun 8 2007, 11:55 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

Wait for what? Does it make an actual difference to the end user what TYPE of memory is used?
goldfries
post Jun 8 2007, 11:57 PM

40K Club
Group Icon
Forum Admin
44,415 posts

Joined: Jan 2003




not something i'd bother with. as long as i get 20 - 30 fps, i'm good.
SUSdattebayo
post Jun 9 2007, 12:02 AM

Look at all my stars!!
*******
Senior Member
5,366 posts

Joined: Aug 2005


would PCIe 1.1's limited bandwidth cause a bottleneck in utilizing GDDR5?
karhoe
post Jun 9 2007, 12:13 AM

Look at all my stars!!
*******
Senior Member
6,238 posts

Joined: Sep 2005
From: Kuala Lumpur


No idea, but there's been a rumour for a while now that PCIe 2.0 is coming soon
Najmods
post Jun 9 2007, 12:27 AM

*mutter mutter mutter mutter*
*******
Senior Member
5,211 posts

Joined: Feb 2005
From: Konohana


It's good for mid-range cards with 128-bit memory buses, but for a high end card like the HD 2900XT, I don't think it needs faster memory. Future faster cards will probably need it, but it's still too early - far too few cards are utilising even GDDR4.
taxidoor
post Jun 9 2007, 02:16 AM

Regular
******
Senior Member
1,008 posts

Joined: Mar 2006
From: Kuantan Pahang



if the cost is on par with GDDR4, why not use GDDR5? buyers will also feel nicer :P
jinaun
post Jun 9 2007, 09:15 AM

where are my stars???
Group Icon
Elite
6,139 posts

Joined: Jan 2003
QUOTE(ikanayam @ Jun 8 2007, 11:55 PM)
Wait for what? Does it make an actual difference to the end user what TYPE of memory is used?
*
LOL..

feel good factor for end users...

eg.. mine is GDDR5, yours are GDDR4

mine pwned urs


LOL
TSkmarc
post Jun 9 2007, 10:04 AM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(ikanayam @ Jun 8 2007, 11:55 PM)
Wait for what? Does it make an actual difference to the end user what TYPE of memory is used?
*
Eh? This question is like asking whether there's any difference in using DDR and DDR2..... rclxub.gif

QUOTE(jinaun @ Jun 9 2007, 09:15 AM)
LOL..

feel good factor for end users...

eg.. mine is GDDR5, yours are GDDR4

mine pwned urs
LOL
*
Ya ya..... if one card offers GDDR4 and the other offers GDDR5, I'd probably go for GDDR5!!! Of course, have to look at the cost also....
ikanayam
post Jun 9 2007, 11:11 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(kmarc @ Jun 8 2007, 09:04 PM)
Eh? This question is like asking whether there's any difference in using DDR and DDR2.....  rclxub.gif
Ya ya..... if one card offers GDDR4 and the other offers GDDR5, I'd probably go for GDDR5!!! Of course, have to look at the cost also....
*
Is there really? Does the end user even know what difference it makes, besides 5 being bigger than 4? Is 5 a guarantee of better performance than 4? Memory bandwidth is a function of bus width (which is independent of the version number) and frequency (which newer versions do help to some extent). Also, the newer types tend to have higher latency at the same clocks, so they have to be clocked much higher before you see most of the performance benefit. Remember the GDDR2 Radeon 9800 Pro which performed worse than the GDDR1 version? wink.gif

GDDR4 isn't even that widely used yet. Top end cards are still using GDDR3. So this GDDR5 thing is still a way off. I wouldn't hold my breath waiting for it.

This post has been edited by ikanayam: Jun 9 2007, 11:12 AM
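The bandwidth point ikanayam makes can be sketched in a few lines of Python. The two configurations below are made-up illustrations, not real card specs:

```python
# Peak memory bandwidth depends on bus width AND data rate,
# not on the GDDR generation number printed on the box.
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak theoretical bandwidth in GB/s (1 GB = 10**9 bytes)."""
    return bus_width_bits / 8 * effective_clock_mhz / 1000

# A wide bus with slower memory can beat a narrow bus with faster memory:
wide_slow = bandwidth_gb_s(256, 1600)    # 256-bit @ 1.6 GHz effective -> 51.2 GB/s
narrow_fast = bandwidth_gb_s(128, 2400)  # 128-bit @ 2.4 GHz effective -> 38.4 GB/s
print(wide_slow, narrow_fast)
```

Which is why a "GDDR4" badge on a 128-bit card can still mean less bandwidth than an older 256-bit GDDR3 card.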
lamely_named
post Jun 9 2007, 11:26 AM

I got younger. ROLLZ.
******
Senior Member
1,931 posts

Joined: Jan 2003
From: Human Mixbreeding Farm

I have a noob theory.

The greedy manufacturers have already developed GDDR 10 since year 2000, and they are holding it back, releasing one version at a time.

and somehow this = profit!!!

win??

wait, that means intel already has tachyon flux processors and AMD's orbital mars space station is already developing time distortion processors, along with ATI's 100% realer-than-life photorealistic GPU codenamed "Real graphic x696969xtx".

OMG!!! Them greedy bastards!!!

I think i'll just stick with my GDDR3, coz I'm a stupid lamb.



ikanayam
post Jun 9 2007, 11:27 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

That theory doesn't work unless there were no competition in the industry.
TSkmarc
post Jun 9 2007, 12:33 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(ikanayam @ Jun 9 2007, 11:11 AM)
Is there really? Does the end user even know what difference it makes, besides 5 being bigger than 4? Is 5 a guarantee of better performance than 4? Memory bandwidth is a function of bus width (which is independent of the version number) and frequency (which newer versions do help to some extent). Also, the newer types tend to have higher latency at the same clocks, so they have to be clocked much higher before you see most of the performance benefit. Remember the GDDR2 Radeon 9800 Pro which performed worse than the GDDR1 version? wink.gif

GDDR4 isn't even that widely used yet. Top end cards are still using GDDR3. So this GDDR5 thing is still a way off. I wouldn't hold my breath waiting for it.
*
Your arguments are somewhat flawed.

Firstly, you can't talk about whether an end user can see or feel any difference between GDDR4 or GDDR5 besides the number. For the average end user, maybe so, but it is like saying that since they are only average user, why not use DDR and not DDR2/DDR3 as they won't feel the difference anyway. If you say that DDR2 is getting cheaper compared to DDR, then the same can be said for GDDR5 compared to GDDR4!

Secondly, the difference to an overclocker is important, even though it may be just a minor increase. Let's say GDDR5 is only 10 fps faster than GDDR4 in real-world games - it is still better for an overclocker!! Again, if you compare the speed that DDR2 gives you compared to DDR1, are there really any huge gains?

Thirdly, your arguments are only speculation. You do not know how GDDR5 will perform compared to GDDR4. I mean, what if GDDR5 is 50% faster than GDDR4? What then?

Lastly, I'm not into debating about DDR2 vs DDR, just taking it as an example/analogy. Technology is advancing and we welcome them with open arms as the end results are still for the benefit of us end users anyway!!! icon_rolleyes.gif


ikanayam
post Jun 9 2007, 12:39 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(kmarc @ Jun 8 2007, 11:33 PM)
Your arguments are somewhat flawed.

Firstly, you can't talk about whether an end user can see or feel any difference between GDDR4 or GDDR5 besides the number. For the average end user, maybe so, but it is like saying that since they are only average user, why not use DDR and not DDR2/DDR3 as they won't feel the difference anyway. If you say that DDR2 is getting cheaper compared to DDR, then the same can be said for GDDR5 compared to GDDR4!

Secondly, the difference to an overclocker is important, even though it may be just a minor increase. Let's say GDDR5 is only 10 fps faster than GDDR4 in real-world games - it is still better for an overclocker!! Again, if you compare the speed that DDR2 gives you compared to DDR1, are there really any huge gains?

Thirdly, your arguments are only speculation. You do not know how GDDR5 will perform compared to GDDR4. I mean, what if GDDR5 is 50% faster than GDDR4? What then?

Lastly, I'm not into debating about DDR2 vs DDR, just taking it as an example/analogy. Technology is advancing and we welcome them with open arms as the end results are still for the benefit of us end users anyway!!!  icon_rolleyes.gif
*
Hah! I do know it will not be 50% faster than GDDR4 at its introduction. As it is right now, GDDR4 is still ramping up. And as I said, it really doesn't matter what GDDR version number it is (or even bus width - the zomg-why-still-128bit complaints about the midrange cards are quite pointless). The most important factor is memory bandwidth (and how well it can be utilized - R600 has massive bandwidth which it can't even seem to use), and most people seem to overlook this.
TSkmarc
post Jun 9 2007, 12:46 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(ikanayam @ Jun 9 2007, 12:39 PM)
Hah! I do know it will not be 50% faster than GDDR4 at its introduction. As it is right now, GDDR4 is still ramping up. And as I said, it really doesn't matter what GDDR version number it is (or even bus width - the zomg-why-still-128bit complaints about the midrange cards are quite pointless). The most important factor is memory bandwidth (and how well it can be utilized - R600 has massive bandwidth which it can't even seem to use), and most people seem to overlook this.
*
True. However, I'm wondering whether "Qimonda" skipped GDDR4 because the performance gains over GDDR3 were not big enough. I guess we just have to wait and see how it performs..... hmm.gif
LEVIATHAN
post Jun 9 2007, 12:47 PM

Master Chief Carl M. Brashear
*******
Senior Member
2,281 posts

Joined: Oct 2006
From: Littleroot Town



QUOTE
Lastly, I'm not into debating about DDR2 vs DDR, just taking it as an example/analogy. Technology is advancing and we welcome them with open arms as the end results are still for the benefit of us end users anyway!!!


lolx. it's good for business well-being la dude. all the short-sighted enthusiasts will be trapped and they will spend like crazy. meh. technology is strongly manipulated by economic imperialists nowadays. even gddr88 won't impress me.
shinjun
post Jun 9 2007, 03:22 PM

Look at all my stars!!
*******
Senior Member
2,247 posts

Joined: Jan 2007



new technology doesn't always work better than old tech
TSkmarc
post Jun 9 2007, 03:41 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(LEVIATHAN @ Jun 9 2007, 12:47 PM)
lolx. it's good for business well-being la dude. all the short-sighted enthusiasts will be trapped and they will spend like crazy. meh. technology is strongly manipulated by economic imperialists nowadays. even gddr88 won't impress me.
*
Well, business is business. Nobody forces anybody to buy new tech/products.

It's fine with me if new technology doesn't impress you. Just remember that the appliances/computers/cars/gadgets that you are currently using (and probably take for granted) all come from advancements in technology. doh.gif

Anyway, back to the topic. GDDR5 is planned to have lower voltages, smaller chips, higher frequencies and increased bandwidth...... rclxms.gif


Faint
post Jun 9 2007, 05:29 PM

Moving forward :)
*******
Senior Member
2,474 posts

Joined: Dec 2006
WTF..... doh.gif
GDDR4 hasn't even become popular, and GDDR5 is already coming??? wasei
skydna
post Jun 10 2007, 02:10 AM

Getting Started
**
Junior Member
236 posts

Joined: Jan 2003
QUOTE(goldfries @ Jun 8 2007, 11:57 PM)
not something i'd bother with. as long as i get 20 - 30 fps, i'm good.
*
when you play NFS Carbon at over 70fps you will feel the real speed..........
shinjun
post Jun 10 2007, 09:10 AM

Look at all my stars!!
*******
Senior Member
2,247 posts

Joined: Jan 2007



above 50fps is considered good already... and I don't think you need 70fps to feel the real speed
jinaun
post Jun 10 2007, 12:05 PM

where are my stars???
Group Icon
Elite
6,139 posts

Joined: Jan 2003
DDR4 - the digit '4' means death in some Asian beliefs.. LOL.. that's why some people skip straight to DDR5


porkchop
post Jun 10 2007, 01:00 PM

Lalala Life's Sweet
*******
Senior Member
6,633 posts

Joined: Jan 2003
From: www.kelvinchiew.com


it could be rm3k or rm4k??
skydna
post Jun 10 2007, 02:52 PM

Getting Started
**
Junior Member
236 posts

Joined: Jan 2003
QUOTE(shinjun @ Jun 10 2007, 09:10 AM)
above 50fps is considered good already... and I don't think you need 70fps to feel the real speed
*
ya, over 50fps is very smooth already
70fps is to avoid the fps drop when the car crashes......... laugh.gif
faez_ridzal
post Jul 20 2007, 08:58 PM

New Member
*
Junior Member
25 posts

Joined: Jan 2006


QUOTE(ikanayam @ Jun 8 2007, 04:55 PM)
Wait for what? Does it make an actual difference to the end user what TYPE of memory is used?
*
good question, and the answer is nope.
GDDR4/5 is just an advancement in memory efficiency and bandwidth.
just like DDR2->DDR3 lah, minor improvements but it runs at reduced voltages... for us to overclock some day haha. joking
verz84
post Jul 20 2007, 09:34 PM

On my way
****
Senior Member
592 posts

Joined: Nov 2006


hm... so fast.. so no need to rush to grab a GDDR4 GC, just wait patiently for GDDR5 to come out
shinjun
post Jul 20 2007, 11:21 PM

Look at all my stars!!
*******
Senior Member
2,247 posts

Joined: Jan 2007



QUOTE(verz84 @ Jul 20 2007, 09:34 PM)
hm... so fast.. so no need to rush to grab a GDDR4 GC, just wait patiently for GDDR5 to come out
*
so far.. GDDR3 is already enough laugh.gif
-Kira-
post Jul 21 2007, 12:07 AM

New Member
*
Junior Member
27 posts

Joined: Jul 2007
From: Seremban


I'm from Seremban...
I'm working in a computer shop.
I wonder why my shop doesn't even sell DDR3 RAM...
=.=" But DDR4 and 5 are emerging... can't believe it
What a downgraded state I'm staying in
hahaha....^o^
How much does DDR3 cost? For 512MB... just asking
Wonder how much it will cost
fiqir
post Jul 21 2007, 12:17 AM

BE YOURSELF
*******
Senior Member
3,810 posts

Joined: Jan 2006



so fast la.. laugh.gif
fcuk90
post Jul 21 2007, 12:17 AM

ef eg ek es
*******
Senior Member
7,863 posts

Joined: May 2007
From: highbury


QUOTE(-Kira- @ Jul 21 2007, 12:07 AM)
I'm from Seremban...
I'm working in a computer shop.
I wonder why my shop doesn't even sell DDR3 RAM...
=.=" But DDR4 and 5 are emerging... can't believe it
What a downgraded state I'm staying in
hahaha....^o^
How much does DDR3 cost? For 512MB... just asking
Wonder how much it will cost
*
around rm350-rm1000?


does the x1650 with 256mb ddr3 perform better than the 7600gt?
shinjun
post Jul 21 2007, 12:26 AM

Look at all my stars!!
*******
Senior Member
2,247 posts

Joined: Jan 2007



QUOTE(fcuk90 @ Jul 21 2007, 12:17 AM)
around rm350-rm1000?
does the x1650 with 256mb ddr3 perform better than the 7600gt?
*
x1650pro<7600GT<x1650xt (correct me if i'm wrong)
-Kira-
post Jul 21 2007, 12:57 AM

New Member
*
Junior Member
27 posts

Joined: Jul 2007
From: Seremban


Wah~~~ so costly ar?
=.= But actually what is RAM for? What's its actual purpose?
Still not clear on it... and how do you tell the difference in performance?

shinjun
post Jul 21 2007, 01:03 AM

Look at all my stars!!
*******
Senior Member
2,247 posts

Joined: Jan 2007



this thread is talking about VGA RAM
we're not talking about the PC RAM here biggrin.gif
please be clear on that
HaHaNoCluE
post Jul 21 2007, 01:23 AM

Newbie
****
Senior Member
628 posts

Joined: Oct 2006


maybe gddr5 will have lower voltages and thus produce less heat, so maybe it'll be easier to cool down the memory???
-Kira-
post Jul 21 2007, 01:37 AM

New Member
*
Junior Member
27 posts

Joined: Jul 2007
From: Seremban


Oh... okok I didn't see ma >.<
What I saw was DDR4 up there....
That DDR3 graphics card, my shop does have le
lolx... But what does that matter?
SUSbudakdegilz
post Jul 21 2007, 01:39 AM

Regular
******
Senior Member
1,138 posts

Joined: Dec 2006



edited..wrong posting

This post has been edited by budakdegilz: Jul 21 2007, 01:41 AM
verz84
post Jul 21 2007, 01:59 AM

On my way
****
Senior Member
592 posts

Joined: Nov 2006


QUOTE(HaHaNoCluE @ Jul 21 2007, 01:23 AM)
maybe gddr5 will have lower voltages and thus produce less heat, so maybe it'll be easier to cool down the memory???
*
yup.. something like that la.. it's more power efficient and faster than the lower grade


Added on July 21, 2007, 2:01 am
QUOTE(-Kira- @ Jul 21 2007, 01:37 AM)
Oh... okok I didn't see ma >.<
What I saw was DDR4 up there....
That DDR3 graphics card, my shop does have le
lolx... But what does that matter?
*
it matters a lot la bro.. for example.. if we compare the 7300GT DDR2 with the 7300GT DDR3... the GC with DDR3 is far more powerful than the DDR2 version.. so there's your proof that it really matters

This post has been edited by verz84: Jul 21 2007, 02:01 AM
Ah Shawn
post Jul 21 2007, 03:00 AM

On my way
****
Senior Member
512 posts

Joined: Apr 2006
From: perak or KL
QUOTE(kmarc @ Jun 9 2007, 03:41 PM)
Anyway, back to the topic. GDDR5 is planned to have lower voltages, smaller chips, higher frequencies and increased bandwidth......  rclxms.gif
*
funny thing is, the article says qimonda will be using the 75nm process when 2900XTs are already on a 65nm manufacturing process.. funny.. they want to use GDDR5 tech but on an older manufacturing process
X.E.D
post Jul 21 2007, 06:32 AM

curmudgeonosorus emeritus
******
Senior Member
1,955 posts

Joined: Jan 2006
From: Llanfair­pwllgwyngyll­gogery­ch


So far like DDR3 for desktops, it just gives a nice, warm placebo effect.

Do the new cards need a new type of RAM? No. They haven't even pinned down GDDR4 latencies with better chips yet, what'd you expect from GDDR FIVE?

For midranges, nVidia might be halving its 8800's 320bit bus for a little better bandwidth while ATi might go X1950 of all sorts and use 256bit with higher prices.
ikanayam
post Jul 21 2007, 09:54 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(X.E.D @ Jul 20 2007, 05:32 PM)
So far like DDR3 for desktops, it just gives a nice, warm placebo effect.

Do the new cards need a new type of RAM? No. They haven't even pinned down GDDR4 latencies with better chips yet, what'd you expect from GDDR FIVE?

For midranges, nVidia might be halving its 8800's 320bit bus for a little better bandwidth while ATi might go X1950 of all sorts and use 256bit with higher prices.
*
For graphics purposes, the slight increase in latency doesn't matter much. The important thing is improving bandwidth, because GPUs are made to hide latency and maximize bandwidth. GDDR5 doesn't seem like a huge change, they're not doubling the internal prefetch IIRC, so i don't see the latencies increasing. It's something like GDDR2->GDDR3, some refinements, but it could go a long way.
soulfly
post Jul 21 2007, 11:45 AM

revving towards 10,000 rpm
Group Icon
VIP
15,903 posts

Joined: Jan 2003
From: Miri



people (consumers) wait for new graphics cards, not new GDDR chips
sasaug
post Jul 22 2007, 01:18 PM

Small Fud
******
Senior Member
1,936 posts

Joined: Nov 2006
From: Klang,Selangor



GDDR chips won't affect people like PC noobies (meaning they don't know the difference between GCs)... for them, more memory means better..... GDDR3 performs well already.. why pay more?
shinjun
post Jul 22 2007, 01:28 PM

Look at all my stars!!
*******
Senior Member
2,247 posts

Joined: Jan 2007



QUOTE(sasaug @ Jul 22 2007, 01:18 PM)
GDDR chips won't affect people like PC noobies (meaning they don't know the difference between GCs)... for them, more memory means better..... GDDR3 performs well already.. why pay more?
*
yea..agree..GDDR3 is already very good drool.gif
SlayerXT
post Jul 22 2007, 01:28 PM

PRIDE!
*******
Senior Member
2,042 posts

Joined: Jan 2003
From: KL



By that time PCIe 2.0 will be a must, because there's already a bandwidth bottleneck on PCIe 1.1.
Lemmings
post Jul 22 2007, 02:45 PM

On my way
****
Senior Member
522 posts

Joined: May 2007
From: Bolehland


that means more money. if it's like that, I'd rather stick to GDDR3 for a while tongue.gif
billytong
post Jul 22 2007, 04:31 PM

Lord Sauron
*******
Senior Member
4,522 posts

Joined: Jan 2003
From: Mordor, Middle Earth.


It seems the RAM engineers are making breakthroughs consistently; I wonder if they will ever hit the wall.
Lemmings
post Jul 22 2007, 04:38 PM

On my way
****
Senior Member
522 posts

Joined: May 2007
From: Bolehland


whenever that happens, we'll be screaming n shouting for newer products tongue.gif
~hunter~
post Jul 22 2007, 09:11 PM

PLsS bE PatiEnT (I'm FasTinG)
****
Senior Member
684 posts

Joined: Apr 2006
From: FPSO kwame nkrumah



i'd rather stick with my gddr3 than spend thousands to upgrade to gddr5.. maybe will upgrade when gddr3 is getting obsolete...
ronaldjoe
post Jul 22 2007, 09:20 PM

Look at all my stars!!
*******
Senior Member
3,569 posts

Joined: Apr 2007
I will only go for it when it becomes mainstream hardware.
It's kinda expensive. I can't afford to chase new technology by spending huge sums of $$$. cry.gif cry.gif cry.gif

Someonesim
post Jul 25 2007, 05:40 PM

In my way
*******
Senior Member
9,132 posts

Joined: Aug 2005



Bandwidth is important. 0.8ns GDDR3 (used on the 8800 Ultra) runs at 2.5GHz DDR, but GDDR5 will start at 3.5GHz ~ 4GHz DDR. Just compare and you'll see which is faster. GDDR4 has (until now) been used on only one product, the Radeon X1950 XTX - such a pity cry.gif
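Plugging those clock figures into the usual bandwidth arithmetic shows the gap. The 384-bit bus width below is the 8800 Ultra class figure, and the GDDR5 rate is the projected launch speed, so treat this as a rough sketch:

```python
def bandwidth_gb_s(bus_width_bits, data_rate_mhz):
    # bytes per transfer * transfers per second, in GB/s (1 GB = 10**9 bytes)
    return bus_width_bits / 8 * data_rate_mhz / 1000

BUS = 384  # 8800 Ultra class bus width
gddr3 = bandwidth_gb_s(BUS, 2500)  # 2.5 GHz DDR -> 120.0 GB/s
gddr5 = bandwidth_gb_s(BUS, 4000)  # 4.0 GHz DDR -> 192.0 GB/s
print(gddr3, gddr5)
```

Same bus, 60% more bandwidth just from the faster memory.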
TSkmarc
post Nov 2 2007, 12:00 AM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



Qimonda ships first GDDR5 memory devices!

Still a bit to wait but here's some more news..... http://www.tgdaily.com/content/view/34659/118/

SlayerXT
post Nov 2 2007, 02:02 AM

PRIDE!
*******
Senior Member
2,042 posts

Joined: Jan 2003
From: KL



Looks like they're struggling to be the first to bring GDDR5 to the market. So they really skipped GDDR4, huh?
t3chn0m4nc3r
post Nov 2 2007, 07:05 PM

Teh Necron Lord
*******
Senior Member
4,139 posts

Joined: Sep 2006
From: Internet


QUOTE(§layerXT @ Nov 2 2007, 03:02 AM)
Looks like they're struggling to be the first to bring GDDR5 to the market. So they really skipped GDDR4, huh?
*
because 4 is not a good number for Asians, especially in our main manufacturing country... whistling.gif
ac_N1
post Nov 3 2007, 01:57 AM

brotherhood of the leaves
******
Senior Member
1,029 posts

Joined: Apr 2007
From: All Blue
Personally I think having GDDR5 is kinda overkill hmm.gif
GDDR3 already manages to play all the high-resolution games to date.
ikanayam
post Nov 3 2007, 04:48 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(ac_N1 @ Nov 2 2007, 12:57 PM)
Personally I think having GDDR5 is kinda overkill hmm.gif
GDDR3 already manages to play all the high-resolution games to date.
*
"up to date" is the key phrase here. Some of us plan to live a little longer.
TSkmarc
post Nov 3 2007, 05:58 AM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(ac_N1 @ Nov 3 2007, 01:57 AM)
Personally I think having GDDR5 is kinda overkill hmm.gif
GDDR3 already manages to play all the high-resolution games to date.
*
There's nothing wrong with technology pushing forward. Granted that it is already at a tremendous pace but I should think that any new technology will be utilized to the fullest and become "old tech" not long after that.....

QUOTE(ikanayam @ Nov 3 2007, 04:48 AM)
"up to date" is the key phrase here. Some of us plan to live a little longer.
*
Ya ya, I want to live long enough to see GDDR20..... laugh.gif
nelienuxe_sara
post Nov 3 2007, 09:51 AM

noob im ur father
*******
Senior Member
2,546 posts

Joined: Jan 2005
From: far far away...
huahuahua
go eat lots of vitamins bro
don't smoke
eat well
and exercise
hope you can make it

i don't care about ddr1/2/3/4
as long as the output is good
plus the price is affordable
i'll go for it
sHawTY
post Nov 3 2007, 04:11 PM

Frequent Reporter
********
All Stars
14,909 posts

Joined: Jul 2005

Haven't seen any GC that uses GDDR5.
Even finding GCs that use GDDR4 is kinda hard. rolleyes.gif
TSkmarc
post Nov 3 2007, 04:42 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(sHawTY @ Nov 3 2007, 04:11 PM)
Haven't seen any GC that uses GDDR5.
Even finding GCs that use GDDR4 is kinda hard. rolleyes.gif
*
Errr... sample volumes just went out la..... products with GDDR5 only coming out H2 2008. See post #51 for the link. smile.gif
ianho
post Nov 3 2007, 05:20 PM

Cucimangkoklife
Group Icon
VIP
15,705 posts

Joined: Mar 2005
From: Tg. Rambutan
QUOTE(kmarc @ Nov 3 2007, 05:58 AM)
There's nothing wrong with technology pushing forward. Granted that it is already at a tremendous pace but I should think that any new technology will be utilized to the fullest and become "old tech" not long after that.....
Ya ya, I want to live long enough to see GDDR20.....  laugh.gif
*
Wei, careful what you wish for. Supposing a new one comes out every year, and we're already at GDDR5, you only have 15 years left! laugh.gif
TSkmarc
post Nov 3 2007, 05:36 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(ianho @ Nov 3 2007, 05:20 PM)
Wei, careful what you wish for. Supposing a new one comes out every year, and we're already at GDDR5, you only have 15 years left! laugh.gif
*
Won't be so fast la. If not for Qimonda, GDDR4 would have been mainstream for another couple of years. By the time GDDR20 comes out, I'll probably be 60 years old. At that time, if I can game with GDDR20, I'll die a happy man!!! laugh.gif

This post has been edited by kmarc: Nov 3 2007, 05:36 PM
verticalar
post Nov 5 2007, 03:26 PM

JackOfallTraders
******
Store Representative
1,228 posts

Joined: Jan 2003
From: NORTH sYdE
RAM is moving very fast lo...
...
wonder if the next gen gfx cards will have these babies...
ronho
post Nov 7 2007, 11:10 PM

Regular
******
Senior Member
1,356 posts

Joined: Dec 2006
From: Subang


really wonder if mid-range users can see the diff between GDDR3 and GDDR4... have any of you guys actually tested these out to see the differences?? just wanted to know.. as sometimes the suppliers promote it but the diff is not that big, yet users pay through the nose..
mgxbox
post Nov 9 2007, 09:54 PM

Regular
******
Banned
1,112 posts

Joined: Oct 2007
QUOTE(ronho @ Nov 7 2007, 11:10 PM)
really wonder if mid-range users can see the diff between GDDR3 and GDDR4... have any of you guys actually tested these out to see the differences?? just wanted to know.. as sometimes the suppliers promote it but the diff is not that big, yet users pay through the nose..
*
If you look at GDDR3 & GDDR4 you'd think GDDR4 is ahead, but the speeds are not very different. It depends on many factors, not just the RAM itself. For example, an ATI 2600XT GDDR4 loses to an NVIDIA 8600GT GDDR3 in most game benchmarks, so you know there is not much difference actually - not now; maybe in 1 or 2 years when software is made to take advantage of it. If you compare GDDR2 to GDDR4, yes, there is a lot of difference. smile.gif
t3chn0m4nc3r
post Nov 9 2007, 10:56 PM

Teh Necron Lord
*******
Senior Member
4,139 posts

Joined: Sep 2006
From: Internet


QUOTE(sHawTY @ Nov 3 2007, 05:11 PM)
Haven't seen any GC that uses GDDR5.
Even finding GCs that use GDDR4 is kinda hard. rolleyes.gif
*
err... really...? blink.gif

actually, up-to-date games still lag on my current rig which I spent almost RM4k to build... sweat.gif
if you are talking about o9 games... please don't compare because I can play them with an Intel GMA950... lowly hardware for lowly software... icon_idea.gif
khaidani
post Nov 10 2007, 07:53 AM

Casual
***
Junior Member
385 posts

Joined: Jan 2003
From: banting,serdang,uniten



seems ATi failed to benefit from GDDR4 on their current GCs? tried my friend's HD2600XT GDDR4 but it seems my X1950Pro is much better
Createmous
post Nov 12 2007, 11:42 PM

On my way
****
Senior Member
638 posts

Joined: Nov 2007
QUOTE(khaidani @ Nov 10 2007, 07:53 AM)
seems ATi failed to benefit from GDDR4 on their current GCs? tried my friend's HD2600XT GDDR4 but it seems my X1950Pro is much better
*
Friends of mine also say that GDDR4 is really not necessary at the moment; if it were, nvidia would already have used GDDR4 on its 8800GT.
besaid
post Dec 3 2007, 04:23 PM

On my way
****
Senior Member
668 posts

Joined: Jul 2006
From: Klang Valley


QUOTE(Createmous @ Nov 12 2007, 11:42 PM)
Friends of mine also say that GDDR4 is really not necessary at the moment; if it were, nvidia would already have used GDDR4 on its 8800GT.
*
your friend...is an idiot.dont listen to him from now onwards...

GeForce 8series Wiki
ikanayam
post Dec 3 2007, 08:35 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(besaid @ Dec 3 2007, 03:23 AM)
your friend...is an idiot.dont listen to him from now onwards...

GeForce 8series Wiki
*
So... what's the wiki link for? Where does it prove your point?
TSkmarc
post Dec 3 2007, 08:44 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(Createmous @ Nov 12 2007, 11:42 PM)
Friends of mine also say that GDDR4 is really not necessary at the moment; if it were, nvidia would already have used GDDR4 on its 8800GT.
*
Well, do you realize that quad core is also not necessary for most people at the moment? If not for folding, I wouldn't want a quad core now anyway..... wink.gif

Besides, what is the capability of the current GDDR4 on ATI's 3870? Did somebody mention 2.6Ghz..... drool.gif drool.gif
SlayerXT
post Dec 4 2007, 01:39 AM

PRIDE!
*******
Senior Member
2,042 posts

Joined: Jan 2003
From: KL



Samsung has already debuted the world's fastest GDDR5, which can operate at a whopping 24GBytes/s. That's really crazy. By then we'll already be able to play Crysis at 60fps++ on ultra high.
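For scale, that 24 GB/s is per chip. Assuming the usual 32-bit-wide GDDR5 chip interface (an assumption, not something from the announcement), it works out to a 6 Gbps per-pin rate, and a 256-bit card built from eight such chips would total 192 GB/s:

```python
chip_bw_gb_s = 24      # reported per-chip bandwidth, GB/s
chip_width_bits = 32   # assumed GDDR5 chip interface width
per_pin_gbps = chip_bw_gb_s / (chip_width_bits / 8)  # 6.0 Gbps per data pin
chips_on_256bit_card = 256 // chip_width_bits        # 8 chips fill the bus
total_gb_s = chips_on_256bit_card * chip_bw_gb_s     # 192 GB/s for the card
print(per_pin_gbps, chips_on_256bit_card, total_gb_s)
```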
Terence573
post Dec 4 2007, 12:49 PM

wow!!!!!
*******
Senior Member
2,459 posts

Joined: May 2006
From: Land Below the Wind


Huhu, why no DDR5 in system RAM?
Anyway, DDR and GDDR are different.
For solid evidence: ATI uses GDDR4 on their cards while Nvidia just stays with GDDR3, yet nvidia still pwns the GDDR4 cards in terms of performance.
But I'm not saying GDDR4 is not good. Maybe the performance just wasn't optimized by ATi; if nvidia were to use GDDR4, maybe it would be good. But considering nvidia hasn't used a single GDDR4 chip on their 8 series cards, it just makes me wonder why...
InnerMax
post Dec 4 2007, 12:55 PM

Getting Started
**
Junior Member
256 posts

Joined: Nov 2007
From: Kota Kinabalu, Sabah



QUOTE(ikanayam @ Dec 3 2007, 08:35 PM)
So... what's the wiki link for? Where does it prove your point?
*
shocking.gif maybe it's a guide or some kind of enlightenment for some... errr.. i dunno, really, what's the wiki for? blink.gif
TSkmarc
post Dec 5 2007, 07:56 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



Hmmmm.... looks like Hynix is also skipping GDDR4 and jumping to GDDR5..... http://www.xbitlabs.com/news/memory/displa...mory_Chips.html

Quote from above article "With its improved speed and power characteristics, GDDR5 is projected to succeed GDDR3 and dominate the graphics DRAM market from the second half of 2008."
ikanayam
post Dec 5 2007, 07:58 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(Terence573 @ Dec 3 2007, 11:49 PM)
Huhu, why no DDR5 in system RAM?
Anyway, DDR and GDDR are different.
For solid evidence: ATI uses GDDR4 on their cards while Nvidia just stays with GDDR3, yet nvidia still pwns the GDDR4 cards in terms of performance.
But I'm not saying GDDR4 is not good. Maybe the performance just wasn't optimized by ATi; if nvidia were to use GDDR4, maybe it would be good. But considering nvidia hasn't used a single GDDR4 chip on their 8 series cards, it just makes me wonder why...
*
It's for political reasons rather than technological.
Terence573
post Dec 5 2007, 09:05 PM

wow!!!!!
*******
Senior Member
2,459 posts

Joined: May 2006
From: Land Below the Wind


QUOTE(ikanayam @ Dec 5 2007, 07:58 PM)
It's for political reasons rather than technological.
*
Political? How to say?
timljh
post Dec 7 2007, 06:42 PM

Getting Started
**
Junior Member
150 posts

Joined: Feb 2007


QUOTE(Terence573 @ Dec 4 2007, 12:49 PM)
Huhu, why no DDR5 in system RAM?
Anyway, DDR and GDDR are different.
For solid evidence: ATI uses GDDR4 on their cards while Nvidia just stays with GDDR3, yet nvidia still pwns the GDDR4 cards in terms of performance.
But I'm not saying GDDR4 is not good. Maybe the performance just wasn't optimized by ATi; if nvidia were to use GDDR4, maybe it would be good. But considering nvidia hasn't used a single GDDR4 chip on their 8 series cards, it just makes me wonder why...
*
i suppose they are still new with GDDR4, just like the shift from 90nm to 65nm; it takes some time to mature..

This post has been edited by timljh: Dec 7 2007, 06:46 PM
CYBERJUDGE
post Dec 8 2007, 11:53 AM

Death Is Only The Beginning
****
Senior Member
671 posts

Joined: Jan 2003


samsung is the one who is releasing it... Change your mobo ... sell your gc....

smile.gif
TSkmarc
post May 11 2008, 09:23 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



Some updates on GDDR5 : GDDR5 in production

QUOTE
Qimonda, a leading manufacturer of advanced dynamic random access memory (DRAM), said that it could deliver next-generation GDDR5 memory for graphics cards and other applications that require high memory bandwidth in volume. At this point Qimonda can supply makers of graphics boards GDDR5 memory with up to 4.50GHz clock-speed.

TSkmarc
post Nov 25 2008, 12:12 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



Hynix Semiconductor Introduces 7GHz GDDR5 Memory!!! shocking.gif

http://www.xbitlabs.com/news/memory/displa...DR5_Memory.html
X.E.D
post Nov 25 2008, 12:38 PM

curmudgeonosorus emeritus
******
Senior Member
1,955 posts

Joined: Jan 2006
From: Llanfair­pwllgwyngyll­gogery­ch


I wonder how you cool it. laugh.gif

Maybe we'll have a peltier stuck to these RAM modules, while the GPU itself merely has a 128/256 bit bus. tongue.gif


Nah, GDDR5 needs to get cheap first. 4/5Ghz modules at 1GB capacities seem to be making the 4870 less attractive vs the GTX 260, so we need that to come down first before making the next moves.

On the other hand, a 192 bit bus, 768MB 5/6Ghz GDDR5 sounds great for a new mid-high chip. smile.gif
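X.E.D's hypothetical 192-bit mid-high card pencils out nicely with the usual bandwidth formula. These are his speculative numbers, not an announced product:

```python
def bandwidth_gb_s(bus_width_bits, data_rate_mhz):
    # bus width in bytes times transfers per second, in GB/s
    return bus_width_bits / 8 * data_rate_mhz / 1000

low = bandwidth_gb_s(192, 5000)   # 192-bit @ 5 GHz effective -> 120.0 GB/s
high = bandwidth_gb_s(192, 6000)  # 192-bit @ 6 GHz effective -> 144.0 GB/s
print(low, high)
```

That would put a narrow, cheap 192-bit board in the same bandwidth league as the wide-bus GDDR3 flagships of the day.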
TSkmarc
post Nov 25 2008, 08:40 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



The lower voltage would help, I guess...... maybe they will introduce heatpipes for the GDDR5! hmm.gif
KenDiriwan
post Nov 25 2008, 09:06 PM


*******
Senior Member
2,277 posts

Joined: Sep 2007
From: It's a tarp


Any picture of the actual product!?......yet?
maxburnz
post Nov 27 2008, 04:25 PM

On my way
****
Senior Member
698 posts

Joined: Jun 2008

last time i used a 7300gt... 256mb ddr2 and 128mb ddr3.
ddr3 wins.. so i think higher GDDR gives at least a slight performance increase, though sometimes it may be negligible

 
