 GDDR4 - waiting for it?, Look again, it's GDDR5!!!!!

TSkmarc
post Jun 8 2007, 11:47 PM, updated 18y ago

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



Looks like Qimonda is skipping GDDR4 and going straight to GDDR5!

Volume production is expected in the first half of 2008!

So, are we going to wait some more?

http://www.digitimes.com/news/a20070607PD215.html
ikanayam
post Jun 8 2007, 11:55 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

Wait for what? Does it make an actual difference to the end user what TYPE of memory is used?
goldfries
post Jun 8 2007, 11:57 PM

40K Club
Group Icon
Forum Admin
44,415 posts

Joined: Jan 2003




not something i'd bother. as long as i get 20 - 30 fps, i'm good.
SUSdattebayo
post Jun 9 2007, 12:02 AM

Look at all my stars!!
*******
Senior Member
5,366 posts

Joined: Aug 2005


Would PCIe 1.1's limited bandwidth cause a bottleneck in utilizing GDDR5?
karhoe
post Jun 9 2007, 12:13 AM

Look at all my stars!!
*******
Senior Member
6,238 posts

Joined: Sep 2005
From: Kuala Lumpur


No idea, but there has long been a rumour that PCIe 2.0 is coming soon.
Najmods
post Jun 9 2007, 12:27 AM

*mutter mutter mutter mutter*
*******
Senior Member
5,211 posts

Joined: Feb 2005
From: Konohana


It's good for mid-range cards with 128-bit memory buses. For a high-end card like the HD 2900 XT, I don't think it needs faster memory. Future, faster cards might need it, but it's still too early; far too few cards are using even GDDR4.
taxidoor
post Jun 9 2007, 02:16 AM

Regular
******
Senior Member
1,008 posts

Joined: Mar 2006
From: Kuantan Pahang



If the cost is on par with GDDR4, why not use GDDR5? Buyers will also feel nicer about it :P
jinaun
post Jun 9 2007, 09:15 AM

where are my stars???
Group Icon
Elite
6,139 posts

Joined: Jan 2003
QUOTE(ikanayam @ Jun 8 2007, 11:55 PM)
Wait for what? Does it make an actual difference to the end user what TYPE of memory is used?
*
LOL..

Feel-good factor for end users...

e.g. mine is GDDR5, yours is GDDR4

mine pwned yours


LOL
TSkmarc
post Jun 9 2007, 10:04 AM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(ikanayam @ Jun 8 2007, 11:55 PM)
Wait for what? Does it make an actual difference to the end user what TYPE of memory is used?
*
Eh? This question is like asking whether there's any difference between using DDR and DDR2.....

QUOTE(jinaun @ Jun 9 2007, 09:15 AM)
LOL..

Feel-good factor for end users...

e.g. mine is GDDR5, yours is GDDR4

mine pwned yours
LOL
*
Ya ya..... if one card offers GDDR4 and the other offers GDDR5, I'd probably go for GDDR5!!! Of course, I have to look at the cost too....
ikanayam
post Jun 9 2007, 11:11 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(kmarc @ Jun 8 2007, 09:04 PM)
Eh? This question is like asking whether there's any difference between using DDR and DDR2.....
Ya ya..... if one card offers GDDR4 and the other offers GDDR5, I'd probably go for GDDR5!!! Of course, I have to look at the cost too....
*
Is there really? Does the end user even know what difference it makes besides 5 being bigger than 4? Is 5 a guarantee of better performance than 4? Memory bandwidth is a function of bus width (which is independent of the version number) and frequency (which newer types do help with, to some extent). Newer types also tend to have higher latency at the same clocks, so they have to be clocked much higher before you see most of the performance benefit. Remember the GDDR2 Radeon 9800 Pro, which performed worse than the GDDR1 version?

GDDR4 isn't even that widely used yet; top-end cards are still using GDDR3. So this GDDR5 thing is still a way off. I wouldn't hold my breath waiting for it.

This post has been edited by ikanayam: Jun 9 2007, 11:12 AM
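The bandwidth point above can be put into numbers. A minimal sketch (the bus widths and clocks below are made-up illustrative figures, not real card specs):

```python
def memory_bandwidth_gbs(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    transfers_per_sec = effective_clock_mhz * 1e6
    return bytes_per_transfer * transfers_per_sec / 1e9

# A wide bus at a modest clock can beat a narrow bus at a high clock,
# which is why the memory version number alone tells you little:
wide = memory_bandwidth_gbs(256, 1600)    # hypothetical 256-bit card: 51.2 GB/s
narrow = memory_bandwidth_gbs(128, 2400)  # hypothetical 128-bit card: 38.4 GB/s
```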
lamely_named
post Jun 9 2007, 11:26 AM

I got younger. ROLLZ.
******
Senior Member
1,931 posts

Joined: Jan 2003
From: Human Mixbreeding Farm

I have a noob theory.

The greedy manufacturers have already developed GDDR 10 since year 2000, and they are holding it back, releasing one version at a time.

and somehow this = profit!!!

win??

Wait, that means Intel already has tachyon flux processors, and AMD's orbital Mars space station is already developing a time-distortion processor, along with ATI's 100%-realer-than-life photorealistic GPU codenamed "Real graphic x696969xtx".

OMG!!! Them greedy bastards!!!

I think i'll just stick with my GDDR3, coz I'm a stupid lamb.



ikanayam
post Jun 9 2007, 11:27 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

That theory doesn't work unless there's no competition in the industry.
TSkmarc
post Jun 9 2007, 12:33 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(ikanayam @ Jun 9 2007, 11:11 AM)
Is there really? Does the end user even know what difference it makes besides the 5 is bigger than 4? Is 5 a guarantee of better performance than 4? Memory bandwidth is a function of bus width which is independent of the model number and frequency which is to some extent helped by new models. Also the newer types tend to have a higher latency at the same clocks, so they have to be clocked much higher before you can see most of the performance benefit. Remember the gddr2 radeon 9800pro which performed worse than the gddr1 version? wink.gif

GDDR4 isn't even that widely used yet. Top end cards are still using gddr3. So this GDDR5 thing is still a way off. I wouldn't hold my breath waiting for it.
*
Your arguments are somewhat flawed.

Firstly, you can't say an end user sees no difference between GDDR4 and GDDR5 beyond the number. For the average end user, maybe so, but that's like saying average users should stick with DDR instead of DDR2/DDR3 since they won't feel the difference anyway. If you say DDR2 is getting cheaper compared to DDR, then the same can be said for GDDR5 compared to GDDR4!

Secondly, the difference matters to an overclocker, even though it may be just a minor increase. Even if GDDR5 is only 10 fps faster than GDDR4 in real-world games, it is still better for an overclocker!! Again, if you compare the speed DDR2 gives you versus DDR1, are there really any huge gains?

Thirdly, your arguments are only speculation. You do not know how GDDR5 will perform compared to GDDR4. I mean, what if GDDR5 is 50% faster than GDDR4? What then?

Lastly, I'm not debating DDR2 vs DDR, just taking it as an example/analogy. Technology is advancing and we welcome it with open arms, as the end results are for the benefit of us end users anyway!!!


ikanayam
post Jun 9 2007, 12:39 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(kmarc @ Jun 8 2007, 11:33 PM)
Your arguments are somewhat flawed.

Firstly, you can't say an end user sees no difference between GDDR4 and GDDR5 beyond the number. For the average end user, maybe so, but that's like saying average users should stick with DDR instead of DDR2/DDR3 since they won't feel the difference anyway. If you say DDR2 is getting cheaper compared to DDR, then the same can be said for GDDR5 compared to GDDR4!

Secondly, the difference matters to an overclocker, even though it may be just a minor increase. Even if GDDR5 is only 10 fps faster than GDDR4 in real-world games, it is still better for an overclocker!! Again, if you compare the speed DDR2 gives you versus DDR1, are there really any huge gains?

Thirdly, your arguments are only speculation. You do not know how GDDR5 will perform compared to GDDR4. I mean, what if GDDR5 is 50% faster than GDDR4? What then?

Lastly, I'm not debating DDR2 vs DDR, just taking it as an example/analogy. Technology is advancing and we welcome it with open arms, as the end results are for the benefit of us end users anyway!!!
*
Hah! I do know it will not be 50% faster than GDDR4 at its introduction; as it is right now, GDDR4 is still ramping up. And as I said, it really doesn't matter what GDDR version number it is (or even bus width; the "zomg why still 128-bit" complaints about midrange cards are quite pointless). The most important factor is memory bandwidth (and how well it can be utilized; R600 has massive bandwidth which it can't even seem to use), and most people overlook this.
TSkmarc
post Jun 9 2007, 12:46 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(ikanayam @ Jun 9 2007, 12:39 PM)
Hah! I do know it will not be 50% faster than GDDR4 at its introduction; as it is right now, GDDR4 is still ramping up. And as I said, it really doesn't matter what GDDR version number it is (or even bus width; the "zomg why still 128-bit" complaints about midrange cards are quite pointless). The most important factor is memory bandwidth (and how well it can be utilized; R600 has massive bandwidth which it can't even seem to use), and most people overlook this.
*
True. However, I'm wondering whether Qimonda skipped GDDR4 because the performance gains over GDDR3 aren't that large. I guess we just have to wait and see how it performs.....
LEVIATHAN
post Jun 9 2007, 12:47 PM

Master Chief Carl M. Brashear
*******
Senior Member
2,281 posts

Joined: Oct 2006
From: Littleroot Town



QUOTE
Lastly, I'm not debating DDR2 vs DDR, just taking it as an example/analogy. Technology is advancing and we welcome it with open arms, as the end results are for the benefit of us end users anyway!!!


lolx. It's good for business well-being la, dude. All the short-minded enthusiasts will be trapped and they will spend like crazy. Meh, technology is strongly manipulated by economic imperialists nowadays. Even GDDR88 won't impress me.
shinjun
post Jun 9 2007, 03:22 PM

Look at all my stars!!
*******
Senior Member
2,247 posts

Joined: Jan 2007



New technology doesn't always work better than old tech.
TSkmarc
post Jun 9 2007, 03:41 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(LEVIATHAN @ Jun 9 2007, 12:47 PM)
lolx. It's good for business well-being la, dude. All the short-minded enthusiasts will be trapped and they will spend like crazy. Meh, technology is strongly manipulated by economic imperialists nowadays. Even GDDR88 won't impress me.
*
Well, business is business. Nobody forces anybody to buy new tech/products.

It's fine with me if new technology doesn't impress you. Just remember that the appliances/computers/cars/gadgets you currently use (and probably take for granted) all come from advances in technology.

Anyway, back to the topic: GDDR5 is planned to have lower voltages, smaller chips, higher frequencies and increased bandwidth......
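To put rough numbers on "increased bandwidth": a back-of-envelope sketch, where the per-pin data rates are assumptions chosen purely for illustration, not confirmed specs for any of these standards:

```python
BUS_BITS = 256  # assume the same 256-bit bus for all three generations

def bandwidth_gbs(per_pin_gbps: float, bus_bits: int = BUS_BITS) -> float:
    """Peak bandwidth in GB/s for bus_bits pins, each moving per_pin_gbps gigabits/s."""
    return per_pin_gbps * bus_bits / 8

# Illustrative per-pin rates only; the point is that a higher data rate on the
# same bus width scales peak bandwidth linearly.
for name, rate in [("GDDR3", 2.0), ("GDDR4", 2.4), ("GDDR5", 4.0)]:
    print(f"{name}: {bandwidth_gbs(rate):.1f} GB/s")
```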


Faint
post Jun 9 2007, 05:29 PM

Moving forward :)
*******
Senior Member
2,474 posts

Joined: Dec 2006
WTF.....
GDDR4 hasn't even become popular and now GDDR5 is coming??? wasei
skydna
post Jun 10 2007, 02:10 AM

Getting Started
**
Junior Member
236 posts

Joined: Jan 2003
QUOTE(goldfries @ Jun 8 2007, 11:57 PM)
not something i'd bother. as long as i get 20 - 30 fps, i'm good.
*
When you play NFS Carbon at over 70 fps, you will feel the real speed..........
