 Definitive R6xx thread, Some more solid info

t3chn0m4nc3r
post May 27 2007, 12:41 PM

Teh Necron Lord
*******
Senior Member
4,139 posts

Joined: Sep 2006
From: Internet


Most probably not... just wait and see, or AMD will face great pressure...
ruffstuff
post May 27 2007, 04:58 PM

Look at all my stars!!
*******
Senior Member
3,345 posts

Joined: Jan 2003
QUOTE(X.E.D @ May 27 2007, 12:32 PM)
Not really.
Did that author get paid off by nVidia? He seems to make a HUGE deal out of the missing entropy decoding/bitstream processing. The 8800 doesn't have it either, which keeps them on an even playing field for now.

Marketing BS, but this wouldn't even compare to the magnitude that Creative has done with their Audigy and X-Fi cards. Those are much easier to get a lawsuit on than this one.

The important factor will be the refreshed flagship chips (HD2950XT+/8900+), for which demanding full AVIVO HD/PureVideo HD v2 would be reasonable.

More importantly, hi-def video is more prevalent in H.264/x264 files than on HD-DVD/BD, and there's a very capable codec called CoreAVC that high-end graphics card users could buy without even blinking. 1080p H.264 fulfilled!
*
At least NVIDIA made a less vague statement than AMD. On NVIDIA's site, they state that all GeForce 8 series cards have PureVideo HD technology, with a small notation below saying that only the 8500 and 8600 are supported. Still confusing, but at least the press was aware of the missing HD decoder on the G80 series even before the launch of the mainstream GeForce 8 cards. No broken promise there.
Makes me think that launching the R600 and the RV6xx at the same time messed things up a little.
blindbox
post May 28 2007, 12:10 AM

Meh
******
Senior Member
1,705 posts

Joined: Nov 2004


QUOTE(X.E.D @ May 27 2007, 12:32 PM)
Not really.
Did that author get paid off by nVidia? He seems to make a HUGE deal out of the missing entropy decoding/bitstream processing. The 8800 doesn't have it either, which keeps them on an even playing field for now.

Marketing BS, but this wouldn't even compare to the magnitude that Creative has done with their Audigy and X-Fi cards. Those are much easier to get a lawsuit on than this one.

The important factor will be the refreshed flagship chips (HD2950XT+/8900+), for which demanding full AVIVO HD/PureVideo HD v2 would be reasonable.

More importantly, hi-def video is more prevalent in H.264/x264 files than on HD-DVD/BD, and there's a very capable codec called CoreAVC that high-end graphics card users could buy without even blinking. 1080p H.264 fulfilled!
*
Uh, I have the CoreAVC codec right here, included with the K-Lite Mega Codec Pack. Hmm, I wonder... was my way the *ahem* way? lol.

This post has been edited by blindbox: May 28 2007, 12:14 AM
jinaun
post Dec 16 2007, 03:55 PM

where are my stars???
Group Icon
Elite
6,139 posts

Joined: Jan 2003
Leaked R680 prototype boards???



Source: http://en.expreview.com/?p=115


[Attached thumbnails: two photos of the R680 prototype board]
ben_panced
post Dec 16 2007, 04:09 PM

PC and MotorBicycle Enthusiast
*****
Senior Member
962 posts

Joined: Dec 2004
From: Kulai


Where are the RAM chips?
Can't see any...
jinaun
post Dec 16 2007, 04:31 PM

where are my stars???
Group Icon
Elite
6,139 posts

Joined: Jan 2003
QUOTE(ben_panced @ Dec 16 2007, 04:09 PM)
Where are the RAM chips?
Can't see any...
*
It's embedded in the GPU, with a total bus width of 2560 bits (composed of three independent buses: 1024-bit write, 1024-bit read, 512-bit read/write) running at 8 GHz.
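For the curious, here is a minimal sketch of the arithmetic behind those (tongue-in-cheek) figures. The bus widths and the 8 GHz clock are taken straight from the post above, not from any real RV6xx datasheet:

```python
# Theoretical peak bandwidth of the claimed eDRAM buses.
# Figures are the ones quoted in the post, assumed for illustration only.
def bus_bandwidth_gb_s(width_bits: int, clock_ghz: float) -> float:
    """Peak bandwidth of one bus in GB/s: bytes per cycle times GHz."""
    return width_bits / 8 * clock_ghz

buses_bits = {"write": 1024, "read": 1024, "read/write": 512}
total = sum(bus_bandwidth_gb_s(bits, 8.0) for bits in buses_bits.values())
print(total)  # 2560.0 GB/s, i.e. an aggregate of 2.56 TB/s
```

That aggregate of 2.56 TB/s is why the claim reads as a joke for 2007-era hardware.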



This post has been edited by jinaun: Dec 16 2007, 04:40 PM
clayclws
post Dec 16 2007, 04:50 PM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


QUOTE(jinaun @ Dec 16 2007, 04:31 PM)
It's embedded in the GPU, with a total bus width of 2560 bits (composed of three independent buses: 1024-bit write, 1024-bit read, 512-bit read/write) running at 8 GHz.
*
Haha~! If only it were true... I guess it'll take a 12 nm process or smaller for that to happen. Now that would be cool.
[Attached images]
The blue squares are where the GDDR3/4 chips are supposed to be.

This post has been edited by clayclws: Dec 16 2007, 05:00 PM
clayclws
post Dec 17 2007, 07:40 PM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


I wonder if it's worth the wait... January isn't that far away... but 31st January is still January...
[Attached images]
smokey
post Dec 17 2007, 08:13 PM

Infinity speed
*******
Senior Member
3,506 posts

Joined: Jan 2003
From: Lumpur
Don't know which date in January... sigh... if we knew, we could prepare for cheaper HD3850 and HD3870 cards...
clayclws
post Dec 17 2007, 08:16 PM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


I'm worried about the use of GDDR3... it's not as energy-efficient or overclockable as GDDR4. They've already used GDDR4, so why take a step back on their flagship product? I hope the report isn't accurate.

This post has been edited by clayclws: Dec 17 2007, 08:16 PM
jinaun
post Dec 17 2007, 11:09 PM

where are my stars???
Group Icon
Elite
6,139 posts

Joined: Jan 2003
QUOTE(clayclws @ Dec 17 2007, 08:16 PM)
I'm worried about the use of GDDR3... it's not as energy-efficient or overclockable as GDDR4. They've already used GDDR4, so why take a step back on their flagship product? I hope the report isn't accurate.
*
Perhaps GDDR3 is much cheaper... and clock for clock, it's faster than GDDR4 because of the additional latencies associated with GDDR4.

GDDR4 will only be faster than GDDR3 if it's clocked high enough.

E.g., compare DDR400 vs DDR2-400.
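Jinaun's clock-for-clock point can be put into numbers with that DDR400 vs DDR2-400 example. The CAS figures below (CL3 vs CL5) are typical-era values assumed purely for illustration, not datasheet quotes:

```python
# At the same effective data rate, a higher CAS count at the same I/O
# clock means more wall-clock time before data arrives.
def cas_latency_ns(cas_cycles: int, io_clock_mhz: float) -> float:
    """Absolute CAS latency in nanoseconds."""
    return cas_cycles * 1000.0 / io_clock_mhz

# Both parts run a 200 MHz I/O clock (400 MT/s effective data rate).
ddr400_ns = cas_latency_ns(3, 200.0)    # assumed CL3 -> 15 ns
ddr2_400_ns = cas_latency_ns(5, 200.0)  # assumed CL5 -> 25 ns
print(ddr400_ns, ddr2_400_ns)
```

So at matched data rates the newer part is actually slower to first data, which is the "only faster if clocked high enough" argument.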
arjuna_mfna
post Dec 18 2007, 02:25 AM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



QUOTE(ben_panced @ Dec 16 2007, 04:09 PM)
where's the ram chip..
cant see any  rclxub.gif
*
That photo shows a sample board (with no RAM chips); it's an unfinished product...
clayclws
post Dec 18 2007, 10:39 AM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


QUOTE(jinaun @ Dec 17 2007, 11:09 PM)
Perhaps GDDR3 is much cheaper... and clock for clock, it's faster than GDDR4 because of the additional latencies associated with GDDR4.

GDDR4 will only be faster than GDDR3 if it's clocked high enough.

E.g., compare DDR400 vs DDR2-400.
*
Hmm... I've never seen such a thing happen in the graphics RAM department before. I've always had the impression that GDDR4 is just a better-specced GDDR3: improved energy efficiency, higher rated speeds, and so on. In other words, it's like the 2007 Vios compared to the ordinary Vios. I guess I'd better dig up more from my friends and the web.
skylinegtr34rule4life
post Dec 18 2007, 12:52 PM

13k elite :P
********
Senior Member
13,340 posts

Joined: Feb 2005
From: back from vacation XD



QUOTE(smokey @ Dec 17 2007, 08:13 PM)
Don't know which date in January... sigh... if we knew, we could prepare for cheaper HD3850 and HD3870 cards...
*
900 bucks is already considered cheap compared to 1,500 bucks for the crappy 2900XT.
TSikanayam
post Dec 18 2007, 01:01 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(clayclws @ Dec 17 2007, 09:39 PM)
Hmm... I've never seen such a thing happen in the graphics RAM department before. I've always had the impression that GDDR4 is just a better-specced GDDR3: improved energy efficiency, higher rated speeds, and so on. In other words, it's like the 2007 Vios compared to the ordinary Vios. I guess I'd better dig up more from my friends and the web.
*
Looks like you probably didn't read what you linked to. GDDR4 doesn't save power and clock higher by magic: its internal memory elements are clocked at half the speed of comparable GDDR3, so latency is higher.
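The half-speed internal clock follows from the prefetch architecture: GDDR3 uses a 4n prefetch and GDDR4 an 8n prefetch, so at an equal transfer rate the GDDR4 array runs at half the clock. A quick sketch (the 2000 MT/s figure is just an example data rate, not a specific product):

```python
# Internal memory-array clock implied by the external data rate and the
# prefetch depth: each array cycle fetches `prefetch` words per pin.
def array_clock_mhz(data_rate_mts: float, prefetch: int) -> float:
    return data_rate_mts / prefetch

gddr3_core = array_clock_mhz(2000, 4)  # 4n prefetch -> 500 MHz array clock
gddr4_core = array_clock_mhz(2000, 8)  # 8n prefetch -> 250 MHz array clock
print(gddr3_core, gddr4_core)
```

A slower array clock means each latency cycle lasts twice as long in wall-clock time, which is where the extra latency comes from.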

This post has been edited by ikanayam: Dec 18 2007, 01:04 PM
sonic_cd
post Dec 18 2007, 02:25 PM

Friendship Is Magic
********
All Stars
19,042 posts

Joined: Jan 2003
From: Soleanna

QUOTE(skylinegtr34rule4life @ Dec 18 2007, 12:52 PM)
900 bucks is already considered cheap compared to 1,500 bucks for the crappy 2900XT.
*
And it's not as power-hungry, lol.
clayclws
post Dec 19 2007, 04:38 PM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


QUOTE(ikanayam @ Dec 18 2007, 01:01 PM)
Looks like you probably didn't read what you linked to. GDDR4 doesn't save power and clock higher by magic: its internal memory elements are clocked at half the speed of comparable GDDR3, so latency is higher.
*
My bad, poor reading comprehension. I was only looking for the advantages.

This post has been edited by clayclws: Dec 19 2007, 06:45 PM
