Definitive R6xx thread, Some more solid info
May 27 2007, 12:41 PM
Senior Member
4,139 posts Joined: Sep 2006 From: Internet
Most probably not... Just wait and see, or AMD will face great pressure...
May 27 2007, 04:58 PM
Senior Member
3,345 posts Joined: Jan 2003
QUOTE(X.E.D @ May 27 2007, 12:32 PM) Not really. At least NVIDIA made a less vague statement compared to AMD. On the NVIDIA site, they state that all GeForce 8 series cards have PureVideo HD technology, with a small notation below saying only the 8500 and 8600 are supported. Still confusing, but at least the press were aware of the missing HD decoder on the G80 series before the mainstream GeForce 8 cards even launched. No broken promise happened.
Did that author get paid off by NVIDIA? He seems to make a HUGE deal out of losing entropy decoding/bitstream processing. The 8800 doesn't have them either, which practically keeps the two on an even playing field for now. It's marketing BS, but it doesn't come close to the magnitude of what Creative pulled with their Audigy and X-Fi cards; those are much easier to build a lawsuit on than this one. The important factor will be the refreshed flagship chips (HD2950XT+/8900+), where demanding full AVIVO HD/PureVideo HD v2 would be reasonable. More importantly, hi-def video is more prevalent in H.264/x264 files than on HD-DVD/BD, and there's a very capable codec called CoreAVC that high-end graphics card users could buy without even blinking. 1080p H.264 fulfilled!
Made me think: launching both R600 and RV6XX at the same time kind of messed things up a little bit.
May 28 2007, 12:10 AM
Senior Member
1,705 posts Joined: Nov 2004
QUOTE(X.E.D @ May 27 2007, 12:32 PM) Not really. Did that author get paid off by NVIDIA? He seems to make a HUGE deal out of losing entropy decoding/bitstream processing. The 8800 doesn't have them either, which practically keeps the two on an even playing field for now. It's marketing BS, but it doesn't come close to the magnitude of what Creative pulled with their Audigy and X-Fi cards; those are much easier to build a lawsuit on than this one. The important factor will be the refreshed flagship chips (HD2950XT+/8900+), where demanding full AVIVO HD/PureVideo HD v2 would be reasonable. More importantly, hi-def video is more prevalent in H.264/x264 files than on HD-DVD/BD, and there's a very capable codec called CoreAVC that high-end graphics card users could buy without even blinking. 1080p H.264 fulfilled!
Uh, I have the CoreAVC codec right here, included with the K-Lite Mega codec pack, hmm. I wonder..... was my way teh *ahem* way? lol.
This post has been edited by blindbox: May 28 2007, 12:14 AM
Dec 16 2007, 03:55 PM
Elite
6,139 posts Joined: Jan 2003
Dec 16 2007, 04:09 PM
Senior Member
962 posts Joined: Dec 2004 From: Kulai
Where's the RAM chip? Can't see any.
Dec 16 2007, 04:31 PM
Elite
6,139 posts Joined: Jan 2003
QUOTE(ben_panced @ Dec 16 2007, 04:09 PM) Where's the RAM chip? Can't see any.
It's embedded into the GPU, with a total bus width of 2560-bit (composed of three independent buses: 1024-bit write, 1024-bit read and 512-bit read/write) working at 8 GHz.
This post has been edited by jinaun: Dec 16 2007, 04:40 PM
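For a sense of scale, here's a quick back-of-the-envelope bandwidth calculation for that (tongue-in-cheek) spec next to an assumed, typical late-2007 256-bit GDDR4 setup; the comparison card's numbers are rough assumptions, not figures from this thread:

```python
# Peak memory bandwidth: bus width (bits) x data rate (GT/s) / 8 bits per byte = GB/s.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Peak bandwidth in GB/s for a bus of the given width at the given data rate."""
    return bus_width_bits * data_rate_gtps / 8

# The joke spec above: 1024-bit write + 1024-bit read + 512-bit read/write buses at 8 GHz.
joke = bandwidth_gb_s(1024 + 1024 + 512, 8.0)

# Assumed comparison point: a 256-bit GDDR4 card at roughly 2.25 GT/s effective.
typical = bandwidth_gb_s(256, 2.25)

print(f"Joke spec:      {joke:,.0f} GB/s (~{joke / 1000:.2f} TB/s)")  # 2,560 GB/s
print(f"256-bit GDDR4:  {typical:.0f} GB/s")                          # ~72 GB/s
```

So the joke spec would be well over thirty times the bandwidth of a real card of the period, which is why the next reply treats it as pure fantasy.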
Dec 16 2007, 04:50 PM
Senior Member
2,659 posts Joined: Sep 2006 From: Miri, PJ & KL
QUOTE(jinaun @ Dec 16 2007, 04:31 PM) It's embedded into the GPU, with a total bus width of 2560-bit (composed of three independent buses: 1024-bit write, 1024-bit read and 512-bit read/write) working at 8 GHz.
Haha~! If only it were true... Guess it's going to take a 12nm process or smaller for that to happen. Now that'll be cool.
[attachmentid=360983] [attachmentid=360985]
The blue squares are where the GDDR3/4 chips are supposed to be.
This post has been edited by clayclws: Dec 16 2007, 05:00 PM
Dec 17 2007, 07:40 PM
Senior Member
2,659 posts Joined: Sep 2006 From: Miri, PJ & KL
I wonder if it's worth the wait... January ain't that far... but 31st January is still January...
[attachmentid=361794] [attachmentid=361795] [attachmentid=361796]
Dec 17 2007, 08:13 PM
Senior Member
3,506 posts Joined: Jan 2003 From: Lumpur
Dunno which date in January... haiz... If we knew, we could prepare for cheaper HD3850 and HD3870 cards...
Dec 17 2007, 08:16 PM
Senior Member
2,659 posts Joined: Sep 2006 From: Miri, PJ & KL
I'm worried about the GDDR3 usage... it's not as energy efficient or OC-able as GDDR4. They've already used GDDR4, so why take a step back for their flagship product? I hope it's not accurately reported.
This post has been edited by clayclws: Dec 17 2007, 08:16 PM
Dec 17 2007, 11:09 PM
Elite
6,139 posts Joined: Jan 2003
QUOTE(clayclws @ Dec 17 2007, 08:16 PM) I'm worried about the GDDR3 usage... it's not as energy efficient or OC-able as GDDR4. They've already used GDDR4, so why take a step back for their flagship product? I hope it's not accurately reported.
Perhaps GDDR3 is much cheaper, and clock for clock it's faster than GDDR4 due to the additional latencies associated with GDDR4. GDDR4 will only be faster than GDDR3 if it's clocked high enough; e.g. take DDR400 vs DDR2-400.
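To put rough numbers on the DDR400 vs DDR2-400 analogy: both run a 200 MHz I/O bus for the same 400 MT/s data rate, but DDR2 typically needs more CAS cycles because its memory core runs at half the speed. The CAS latencies below are assumed, typical-of-the-era values for illustration, not figures quoted in this thread:

```python
# CAS latency in nanoseconds = CAS cycles / I/O clock (MHz) * 1000.
def cas_latency_ns(cas_cycles: int, io_clock_mhz: float) -> float:
    """Latency contribution of CAS, in nanoseconds."""
    return cas_cycles / io_clock_mhz * 1000.0

# Same 400 MT/s data rate and 200 MHz I/O clock for both; only the CAS setting differs.
ddr_400  = cas_latency_ns(cas_cycles=3, io_clock_mhz=200)  # DDR400 at CL3   -> 15 ns
ddr2_400 = cas_latency_ns(cas_cycles=4, io_clock_mhz=200)  # DDR2-400 at CL4 -> 20 ns

print(f"DDR400   CL3: {ddr_400:.0f} ns")
print(f"DDR2-400 CL4: {ddr2_400:.0f} ns  (same bandwidth, higher latency)")
```

Same idea with GDDR3 vs GDDR4: at equal clocks the newer memory tends to lose on latency, and it only pulls ahead once its extra clock headroom is actually used.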
Dec 18 2007, 02:25 AM
Senior Member
1,496 posts Joined: Jan 2006 From: Baling, Kedah
Dec 18 2007, 10:39 AM
Senior Member
2,659 posts Joined: Sep 2006 From: Miri, PJ & KL
QUOTE(jinaun @ Dec 17 2007, 11:09 PM) Perhaps GDDR3 is much cheaper, and clock for clock it's faster than GDDR4 due to the additional latencies associated with GDDR4. GDDR4 will only be faster than GDDR3 if it's clocked high enough; e.g. take DDR400 vs DDR2-400.
Hmm... I have never seen such a thing happen in the graphics RAM department before. I've always had the impression that GDDR4 is simply a better-specced GDDR3: more energy efficient, higher official speeds, etc. In other words, it's like the 2007 Vios compared to the ordinary Vios. I guess I'd better go dig up more from my friends and the web.
Dec 18 2007, 12:52 PM
Senior Member
13,340 posts Joined: Feb 2005 From: back from vacation XD
Dec 18 2007, 01:01 PM
Senior Member
10,544 posts Joined: Jan 2003 From: GMT +8:00
QUOTE(clayclws @ Dec 17 2007, 09:39 PM) Hmm... I have never seen such a thing happen in the graphics RAM department before. I've always had the impression that GDDR4 is simply a better-specced GDDR3: more energy efficient, higher official speeds, etc. In other words, it's like the 2007 Vios compared to the ordinary Vios. I guess I'd better go dig up more from my friends and the web.
Looks like you probably didn't read what you linked to. GDDR4 doesn't save power and clock higher by magic: its internal memory elements are clocked at half the speed of comparable GDDR3, so latency is higher.
This post has been edited by ikanayam: Dec 18 2007, 01:04 PM
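A small sketch of that point, assuming GDDR3 uses a 4n prefetch and GDDR4 an 8n prefetch, so at the same per-pin data rate the GDDR4 memory array runs at half the clock; the data rate used below is an illustrative assumption, not a spec from this thread:

```python
# With an n-bit prefetch, array (core) clock = per-pin data rate / prefetch depth,
# so a deeper prefetch means a slower core and more nanoseconds per core-clock cycle.
def core_clock_mhz(data_rate_mtps: float, prefetch: int) -> float:
    """Internal array clock implied by a per-pin data rate and prefetch depth."""
    return data_rate_mtps / prefetch

DATA_RATE = 2000  # MT/s per pin, held equal for both so the comparison is like-for-like

for name, prefetch in [("GDDR3 (4n prefetch, assumed)", 4),
                       ("GDDR4 (8n prefetch, assumed)", 8)]:
    core = core_clock_mhz(DATA_RATE, prefetch)
    print(f"{name}: ~{core:.0f} MHz core at {DATA_RATE} MT/s "
          f"-> {1000 / core:.1f} ns per core cycle")
```

The slower core is what buys the power savings and clock headroom, and it is also exactly where the extra latency comes from.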
Dec 18 2007, 02:25 PM
All Stars
19,042 posts Joined: Jan 2003 From: Soleanna
Dec 19 2007, 04:38 PM
Senior Member
2,659 posts Joined: Sep 2006 From: Miri, PJ & KL
QUOTE(ikanayam @ Dec 18 2007, 01:01 PM) Looks like you probably didn't read what you linked to. GDDR4 doesn't save power and clock higher by magic: its internal memory elements are clocked at half the speed of comparable GDDR3, so latency is higher.
My bad. Bad reading comprehension; I was only looking for the advantages.
This post has been edited by clayclws: Dec 19 2007, 06:45 PM