
 Definitive R6xx thread, Some more solid info

TSikanayam
post Mar 1 2007, 11:34 AM, updated 19y ago

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

R600 tidbits abound. Also some on Barcelona. smile.gif
http://www.eetimes.com/news/latest/showArt...cleID=197700269
http://www.reghardware.co.uk/2007/02/28/amd_690g_launch/
http://www.informationweek.com/news/showAr...n=Breaking+News
http://blogs.zdnet.com/Berlind/?p=363
http://content.zdnet.com/2346-10741_22-57089.html
http://blogs.zdnet.com/Berlind/?p=364

Hehe. Anyone want to guess the clocks? There's enough information to estimate that from the first link alone smile.gif
It's teraflops btw, not terabits.

This is what we know for sure so far.

R6xx/RV6xx common features:
UVD = Universal video decoder, supposedly 100% video decode offload even for h.264
Built in HD audio controller for HDMI output without needing extra cabling (!)
Dual-link HDCP content output capable (G80 can only output HDCP content at single-link rates! Don't know if they changed this in the coming G86/G84)

R600:
80nm HS process
Smaller die size than G80
Single chip, unlike G80 which needs the NVIO chip
Built in audio controller(!)
320 scalar unified shaders/ALUs/stream processors
~16 texture units
~800MHz clock speed for top end
512bit memory bus
improved ring bus architecture

RV630:
65nm process
~150mm²
128bit memory bus
120 scalar unified shaders/ALUs/stream processors
~8 texture units

RV610:
65nm process
~80mm²
64bit memory bus (don't know if it can go up to 128bit)
40 scalar unified shaders
~4 texture units

This post has been edited by ikanayam: Apr 30 2007, 07:55 AM
jcliew
post Mar 1 2007, 03:08 PM

Retired Enthusiast
*****
Senior Member
889 posts

Joined: Oct 2006
From: Johor Bahru


What's the difference between teraflops and terabits?
billytong
post Mar 1 2007, 03:27 PM

Lord Sauron
*******
Senior Member
4,522 posts

Joined: Jan 2003
From: Mordor, Middle Earth.


Let me guess, the R600 should go somewhere around 640MHz?

This post has been edited by billytong: Mar 1 2007, 03:29 PM
jinaun
post Mar 1 2007, 03:36 PM

where are my stars???
Group Icon
Elite
6,139 posts

Joined: Jan 2003
QUOTE(jcliew @ Mar 1 2007, 03:08 PM)
What's the difference between teraflops and terabits?
*
teraflops (FLOPS = Floating Point Operations Per Second)

terabits (a unit of binary data, either bandwidth or storage space)

err...
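
To make the distinction concrete, here is a minimal Python sketch using the rough R600 numbers floated in this thread; the 2 FLOPS per ALU per clock and the memory data rate are assumptions, for illustration only:

# FLOPS measure compute rate; bits measure data quantity or bandwidth.
alus = 320                # stream processors (from the first post)
flops_per_alu = 2         # one multiply-add counted as 2 FLOPS per clock (assumption)
clock_hz = 800e6          # ~800MHz core clock
compute_rate = alus * flops_per_alu * clock_hz
print(compute_rate / 1e12)                 # 0.512 -> about half a teraflop of compute

bus_width_bits = 512      # memory bus width (from the first post)
data_rate = 2.0e9         # 2.0 GT/s effective memory data rate (assumption)
print(bus_width_bits * data_rate / 1e12)   # 1.024 -> about a terabit/s of memory traffic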
jcliew
post Mar 1 2007, 03:39 PM

Retired Enthusiast
*****
Senior Member
889 posts

Joined: Oct 2006
From: Johor Bahru


QUOTE(jinaun @ Mar 1 2007, 03:36 PM)
teraflops (FLOPS = Floating Point Operations Per Second)

terabits (a unit of binary data, either bandwidth or storage space)

err...
*
So teraflops involve an enormous amount of data relative to terabits?
TSikanayam
post Mar 1 2007, 03:53 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(jcliew @ Mar 1 2007, 02:39 AM)
So teraflops involve an enormous amount of data relative to terabits?
*
No, they are different units of measurement meant to show different things. Anyway, please use google or the QnA section for such questions, i don't want this thread polluted with trivial posts on basic definitions.

This post has been edited by ikanayam: Mar 1 2007, 03:54 PM
empire23
post Mar 1 2007, 05:15 PM

Team Island Hopper
Group Icon
Staff
9,417 posts

Joined: Jan 2003
From: Bladin Point, Northern Territory
QUOTE(jcliew @ Mar 1 2007, 03:39 PM)
So teraflops involve an enormous amount of data relative to terabits?
*
Didn't i tell you to stop you goddamned spam fag?!

Please lah, google for mundane questions, and STOP POSTING FOR THE LOVE OF GOD!
taxidoor
post Mar 1 2007, 11:16 PM

Regular
******
Senior Member
1,008 posts

Joined: Mar 2006
From: Kuantan Pahang



Shit, they keep delaying the release date la.. from Feb to May and now to June some more... Is their card not powerful enough to fight Nvidia, so they keep looking for a solution? Or is AMD's new processor not ready, so they're waiting for a COMBO attack since they're under the same BOSS?
TSikanayam
post Mar 2 2007, 12:03 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

It's 800MHz for 512GFLOPS a piece smile.gif

320ALUs*2FLOPS*800MHz = 512GFLOPS
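
For anyone who wants to check that arithmetic against the EE Times demo (two R600 cards hitting roughly a teraflop), a quick Python sketch; the 2 FLOPS per ALU per clock (one multiply-add) is an assumption:

# Work backwards from ~1 teraflop across two cards to the per-card clock.
total_flops = 1.0e12      # demo figure, two cards combined
cards = 2
alus = 320                # stream processors per card
ops_per_clock = 2         # one multiply-add counted as 2 FLOPS (assumption)
clock_hz = total_flops / cards / (alus * ops_per_clock)
print(clock_hz / 1e6)     # ~781 MHz, i.e. right around the ~800MHz figure above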

This post has been edited by ikanayam: Mar 2 2007, 03:37 PM
TSikanayam
post Mar 2 2007, 03:38 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00


QUOTE(taxidoor @ Mar 1 2007, 10:16 AM)
Shit, they keep delaying the release date la.. from Feb to May and now to June some more... Is their card not powerful enough to fight Nvidia, so they keep looking for a solution? Or is AMD's new processor not ready, so they're waiting for a COMBO attack since they're under the same BOSS?
*
The reason for the final delay is in the very first link. It's strategic, not technical.
QUOTE
Release of the R600 has been delayed "a few weeks" so that AMD can roll out a full suite of graphics chips covering multiple market segments for the latest Microsoft DirectX 10 applications programming interface.

c38y50y70
post Mar 3 2007, 01:07 PM

Getting Started
**
Validating
140 posts

Joined: Dec 2005
From: R&D Center & Home



QUOTE
The company showed a Barcelona-based system using two 200W R600 graphics cards to hit a terabit/second benchmark.
-- EE Times

Graphics card power consumption is getting crazier!
TSikanayam
post Mar 3 2007, 01:13 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(c38y50y70 @ Mar 3 2007, 12:07 AM)
Graphics card power consumption is getting crazier!
*
G80 is already 180W, so it's not that much worse, depending on how well it performs. I've heard however that things are not going to get any worse than they are now, in fact i think next generation might be better. AMD's upcoming midrange chips RV610, RV630 should be good with power consumption, from all current indications (128W that's been thrown about is wrong AFAIK wink.gif ).
c38y50y70
post Mar 3 2007, 01:42 PM

Getting Started
**
Validating
140 posts

Joined: Dec 2005
From: R&D Center & Home



Yes, these DX10 GPUs are insane. The cooling is much tougher on a graphics card than on a CPU. Well, i guess the developers have no choice if they want great performance. Have you heard of any upcoming ATi GPU with less than 50W?
Goliath764
post Mar 3 2007, 01:50 PM

The Lone Wolf
****
Senior Member
698 posts

Joined: Sep 2006
From: KK, Sabah



I don't care bout the power consumption as long as it performs well. Performance is the first priority to me.
TSikanayam
post Mar 3 2007, 02:17 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(c38y50y70 @ Mar 3 2007, 12:42 AM)
Yes, these DX10 GPUs are insane. The cooling is much tougher on a graphics card than on a CPU. Well, i guess the developers have no choice if they want great performance. Have you heard of any upcoming ATi GPU with less than 50W?
*
GPUs are behind CPUs in terms of power saving features and low power design, for sure. However, a lot of that power is also used to power the gobs of high speed memory on the board; CPUs don't have the memory in their power budget. I dunno, i heard many things... tongue.gif


QUOTE(Goliath764 @ Mar 3 2007, 12:50 AM)
I don't care bout the power consumption as long as it performs well. Performance is the first priority to me.
*
Power requirements cannot keep growing at the current rate. You better care because if things were to continue this way you'll have to buy a more powerful air conditioner every time you upgrade. It's bad for the environment, it's wasteful, and it's just plain dumb to keep increasing power requirements like that.

This post has been edited by ikanayam: Mar 3 2007, 02:17 PM
Najmods
post Mar 3 2007, 07:42 PM

*mutter mutter mutter mutter*
*******
Senior Member
5,211 posts

Joined: Feb 2005
From: Konohana


QUOTE(ikanayam @ Mar 3 2007, 02:17 PM)
Power requirements cannot keep growing at the current rate. You better care because if things were to continue this way you'll have to buy a more powerful air conditioner every time you upgrade. It's bad for the environment, it's wasteful, and it's just plain dumb to keep increasing power requirements like that.
*
Believe it, it will keep growing, and you can't simply compare it to a CPU. All those pixel processors, stream processors and 1GB of memory suck tons of power. It's inevitable and unavoidable; if you want ultimate power, you have to make the ultimate sacrifice (new PSU, more sophisticated cooling etc). No matter how much noise we make about this issue, they will simply ignore it and still release this beast. Buy midrange cards if you can't keep up with the heat, or go get power out of a nuclear reactor

I still remember the good old days, how simple past high-end cards were, especially the GF4 Ti-4600: great speed and simple single-slot cooling, and the midrange GF4 Ti-4200 could simply be overclocked to Ti-4600 speeds. I don't believe future high-end cards are gonna be the same
c38y50y70
post Mar 3 2007, 08:14 PM

Getting Started
**
Validating
140 posts

Joined: Dec 2005
From: R&D Center & Home



QUOTE(ikanayam @ Mar 3 2007, 03:17 PM)
However, a lot of that power is also used to power the gobs of high speed memory on the board; CPUs don't have the memory in their power budget. I dunno, i heard many things... tongue.gif
*
Yeah, that memory actually takes a lot of power. A 16MB GDDR3 chip at 1000MHz probably consumes ~2W, so 8 of them (128MB GDDR3) are already ~16W, hahaha.
Now the 8800GTS with 320MB of GDDR3 is running at even higher speeds. The RAM chips alone could take up more than 50W hmm.gif
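
A back-of-envelope version of that estimate in Python; the ~2W per chip figure is the guess above, not a datasheet number:

# Memory power scales with chip count (clocks and voltage ignored here).
watts_per_chip = 2.0      # rough guess per 16MB GDDR3 chip at 1000MHz
chip_size_mb = 16

def mem_power(total_mb):
    chips = total_mb // chip_size_mb
    return chips, chips * watts_per_chip

print(mem_power(128))   # (8, 16.0)  -> ~16W for 128MB
print(mem_power(320))   # (20, 40.0) -> ~40W for 320MB at the same 2W/chip;
                        # higher-clocked chips would push that toward the >50W guess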

QUOTE(Najmods)
Believe it, it will keep growing, and you can't simply compare it to a CPU. All those pixel processors, stream processors and 1GB of memory suck tons of power. It's inevitable and unavoidable; if you want ultimate power, you have to make the ultimate sacrifice (new PSU, more sophisticated cooling etc). No matter how much noise we make about this issue, they will simply ignore it and still release this beast. Buy midrange cards if you can't keep up with the heat, or go get power out of a nuclear reactor

Hopefully the next generation of graphics cards can stay in the 150W range, just like CPUs nowadays. A good hardware designer should watch power consumption and silicon size, not just simply add buffers and increase bandwidth.
Hornet
post Mar 3 2007, 08:22 PM

What?
*******
Senior Member
4,251 posts

Joined: Jan 2003
From: Malacca, Malaysia, Earth


QUOTE(Najmods @ Mar 3 2007, 07:42 PM)
Believe it, it will keep growing, and you can't simply compare it to a CPU. All those pixel processors, stream processors and 1GB of memory suck tons of power. It's inevitable and unavoidable; if you want ultimate power, you have to make the ultimate sacrifice (new PSU, more sophisticated cooling etc). No matter how much noise we make about this issue, they will simply ignore it and still release this beast. Buy midrange cards if you can't keep up with the heat, or go get power out of a nuclear reactor

I still remember the good old days, how simple past high-end cards were, especially the GF4 Ti-4600: great speed and simple single-slot cooling, and the midrange GF4 Ti-4200 could simply be overclocked to Ti-4600 speeds. I don't believe future high-end cards are gonna be the same
*
Practically, that cannot happen. I mean yes, requirements will go up, but eventually engineers will have to find a solution; it cannot keep growing no holds barred. Engineers will not ignore it, because if your competitor can come up with a part that doesn't need a nuclear reactor power supply and a vacuum cleaner for cooling, you're heading for doom. Remember how the 5800 Ultra was blasted for its loud cooling solution.

This is what designing a new hardware architecture is all about. Everything, including power consumption, has to be taken into account. It all sums up to efficiency. Growing in performance with an equal growth in power requirements means there's absolutely no improvement in efficiency.
TSikanayam
post Mar 16 2007, 07:50 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

For those who didn't already know...
R600 (and RV6xx) is a sound card too. Built in HD audio controller. Very many interesting possibilities may be enabled by this... wink.gif
http://www.beyond3d.com/content/news/127

I don't understand German, but the slides are in english. UVD ftw.
http://www.k-hardware.de/news.php?s=c&news_id=6399
Renovatio
post Mar 16 2007, 09:56 AM

~ Enthusiast low on cash ~
******
Senior Member
1,942 posts

Joined: Nov 2005
From: Penang


QUOTE(ikanayam @ Mar 16 2007, 07:50 AM)
For those who didn't already know...
R600 (and RV6xx) is a sound card too. Built in HD audio controller. Very many interesting possibilities may be enabled by this... wink.gif
http://www.beyond3d.com/content/news/127

*
Now that's definitely new tongue.gif I can't imagine how much noise it will actually introduce to the port (with all the vibration and the big fat fan running) hehehe, or they might really do a great job and pull it through with this new innovation. Definitely worth the wait.
TSikanayam
post Mar 16 2007, 10:11 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(Renovatio @ Mar 15 2007, 08:56 PM)
Now that's definitely new tongue.gif I can't imagine how much noise it will actually introduce to the port (with all the vibration and the big fat fan running) hehehe, or they might really do a great job and pull it through with this new innovation. Definitely worth the wait.
*
Electronic noise? On the audio line? The output is digital (HDMI/SPDIF). Unless there's something REALLY broken, noise is not an issue on a digital output.

Hehe, there are many other interesting possibilities that may come with that. Hardware accelerated audio on your GPU perhaps.... who knows....


This post has been edited by ikanayam: Mar 17 2007, 05:39 AM
TSikanayam
post Mar 17 2007, 05:40 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

RV610 die size:
http://www.beyond3d.com/content/news/129

RV630 die size:
http://www.beyond3d.com/content/news/130

Hehe. RV630 is most interesting.
TSikanayam
post Apr 3 2007, 01:45 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

http://www.pcgameshardware.de/?article_id=573129
Yay, finally someone from AMD confirms publicly that R600 is 80nm, thus putting the ridiculous rumors about 65nm "respins" to rest. Not that there was any doubt about this in the first place... lol

And just to reiterate since people are still hung up on the OEM board pictures, the retail version of R600 will not be 12" long. It will be shorter than the 8800GTX as far as i know, about 9.5" long.

First post updated with what is known so far, feel free to suggest if there's something i missed.

This post has been edited by ikanayam: Apr 3 2007, 02:05 AM
Najmods
post Apr 3 2007, 02:39 AM

*mutter mutter mutter mutter*
*******
Senior Member
5,211 posts

Joined: Feb 2005
From: Konohana


QUOTE(ikanayam @ Apr 3 2007, 01:45 AM)
http://www.pcgameshardware.de/?article_id=573129
Yay, finally someone from AMD confirms publicly that R600 is 80nm, thus putting the ridiculous rumors about 65nm "respins" to rest. Not that there was any doubt about this in the first place... lol

And just to reiterate since people are still hung up on the OEM board pictures, the retail version of R600 will not be 12" long. It will be shorter than the 8800GTX as far as i know, about 9.5" long.

First post updated with what is known so far, feel free to suggest if there's something i missed.
*
No, that's good enough, thanks for the link smile.gif I don't think you missed anything

People who say the R600 is too long should be killed on sight
hyyam85
post Apr 3 2007, 12:10 PM

On my way
****
Senior Member
510 posts

Joined: Sep 2005


What is the power consumption like? Hopefully i will have a powerful enough PSU to power it
g5sim
post Apr 6 2007, 05:24 AM

Look at all my stars!!
*******
Senior Member
5,757 posts

Joined: Jan 2003
From: Sri Kembangan


OMG .. R6xx cards to have built-in HD sound !!

http://www.theinquirer.net/default.aspx?article=38714


R600 HAS A secret weapon, an internal sound card. This is the one thing that Nvidia's G8x can't match, other than HDCP on dual-link HDMI.
The ATI sound implementation is not GPGPU code. It is dedicated silicon, probably brought on by the Vista DRM infection and MS twisting arms to force it on people.

In any case, R600 will be compliant with the Vista requirements and can send sound directly over a HDCP/HDMI link. We are told this is a full HD sound setup, not a cheesy 2.1 channel thing.

In contrast, NV G8x parts can't do this. They have to run an external cable from the sound chip to the GPU. This may not sound like much but it blows out several kinds of auto configuration and worse yet violates Vista logo requirements. One has to wonder if this is why NV can't seem to make a functional Vista driver six months in.

The problem with Vista is that the DRM infection mandates that you do not share S/PDIF output over unencrypted links. R600 does this by combining audio and video streams, then pumping them out over HDCP infected links. This is user antagonistic DRM, but it complies with MS logo requirements, and they don't care about user experiences any more than the content mafiaa.

Add in that the R600 can do dual-link HDCP and you are going to be swimming in bandwidth, more than enough to pipe sound down.

Nvidia's G8x on the other hand can't do dual link HDCP at all, so if you have a 30-inch monitor, you will get a black screen. At that point, sound is the least of your problems.

Basically it looks like the sound card in R600 is going to be the killer app for home theatre type apps. G8x simply can not do what is needed here, buggy drivers or not. While the DRM infection stinks, at least R600 will be able to comply.

TSikanayam
post Apr 6 2007, 06:11 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

I don't see anything in there that has not been in the first post of this thread for at least the past 3 days. Except the usual inq nonsense.
LExus65
post Apr 6 2007, 09:56 AM

Old Gezzer.....
******
Senior Member
1,995 posts

Joined: May 2005


Why do they still limit the RV630 to 128-bit............... aiyah..............
c38y50y70
post Apr 6 2007, 09:59 AM

Getting Started
**
Validating
140 posts

Joined: Dec 2005
From: R&D Center & Home



To keep the cost affordable for midrange consumers.
TSikanayam
post Apr 6 2007, 11:07 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(LExus65 @ Apr 5 2007, 08:56 PM)
Why do they still limit the RV630 to 128-bit............... aiyah..............
*
Yes. Memory bandwidth is a function of bus width AND clock speeds, among other things.
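
A rough Python sketch of that relationship; the memory data rates below are illustrative, not confirmed RV630 specs:

# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
def bandwidth_gbs(bus_bits, data_rate_mtps):
    return (bus_bits / 8) * data_rate_mtps / 1000   # GB/s

print(bandwidth_gbs(128, 1400))   # 22.4 GB/s - 128-bit with 1.4GT/s GDDR3 (illustrative)
print(bandwidth_gbs(128, 2200))   # 35.2 GB/s - 128-bit with 2.2GT/s GDDR4 (illustrative)
print(bandwidth_gbs(256, 1400))   # 44.8 GB/s - what a 256-bit bus would buy at the slower speed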


billytong
post Apr 6 2007, 04:17 PM

Lord Sauron
*******
Senior Member
4,522 posts

Joined: Jan 2003
From: Mordor, Middle Earth.


QUOTE(LExus65 @ Apr 6 2007, 09:56 AM)
Why do they still limit the RV630 to 128-bit............... aiyah..............
*

With the high speed of GDDR3/GDDR4, a 128-bit bus width should suffice for a midrange graphics card.
zeustronic
post Apr 6 2007, 08:10 PM

Retire OC Into Audiophiles
*******
Senior Member
2,605 posts

Joined: Jan 2006
From: A Place Between Heaven & Hell


Delete Post

This post has been edited by zeustronic: Apr 6 2007, 08:15 PM
Hyde`fK
post Apr 6 2007, 08:14 PM

D9s Killer
*******
Senior Member
2,378 posts

Joined: Jan 2003
From: Miri,Sarawak,Malaysia Status: Dead!



Here's the video


pohpiah
post Apr 14 2007, 10:54 AM

Getting Started
**
Junior Member
201 posts

Joined: May 2006


I want to ask: any hint of the budget and midrange prices for these cards? Will they be around R500-series card prices? What is the fate, price-wise, of the last-gen cards (based on previous new-generation rollouts)?
TSikanayam
post Apr 14 2007, 11:51 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

Honestly, i don't know for certain about pricing at this point (these things can change quite rapidly depending on conditions), but look at the previous generation midrange prices and you should be able to get a decent idea. If i knew for certain, it would be up there in the first post. I do not put anything that is "rumored" in there. All that information there is confirmed to be true unless i state otherwise.

This post has been edited by ikanayam: Apr 14 2007, 11:51 AM
g5sim
post Apr 15 2007, 04:49 AM

Look at all my stars!!
*******
Senior Member
5,757 posts

Joined: Jan 2003
From: Sri Kembangan


more nonsense that i like to read from the inquirer biggrin.gif

Five models at launch, HD Audio and HDMI supported all the way

http://www.theinquirer.net/default.aspx?article=38923

By Theo Valich: Saturday 14 April 2007, 08:32

AMD MADE a mess of its own naming conventions, because it has already launched rebranded RV510 and RV530/560 products as Radeon X2300
. So it decided to ditch the "X" prefix, which was introduced to mark the introduction of the PCI Express standard, even though many cards were shipped with an AGP interface.

The HD models really have a reason to be called that, since from RV610LE chip to R600 boards, the HD Audio codec is present, and HDMI support is native. However, there are some video processing differences, such as a repeat of RV510/530 vs. R580 scenario, albeit with a different GPU mix.

First of all, we have received a lot of e-mails from readers asking us about HDMI support and how that's done, since leaked pictures of OEM designs do not come with HDMI connectors, rather dual-link DVIs only. Again, we need to remind you of our R600 HDMI story, and that story is about elementary maths.

A single dual-link DVI connector has enough bandwidth to stream both video and audio in 1280x720 and 1920x1080 onto HDMI interface by using a dongle. On lower-end models, it is possible that the dongle will be skipped and that a direct HDMI connector will be placed onto the bracket.

The RV610LE will be known to the world as Radeon HD 2400 Pro, and will support 720p HD playback. If you want 1080p HD playback, you have to get a faster performing part. In addition, the HD2400Pro will be paired with DDR2 memory only, the very same chips many of enthusiasts use as their system memory - DDR2-800 or PC2-6400, 800 MHz memory. The reason for the 720p limitation is very simple - this chip heads against G86-303 chip with 64-bit memory interface, the 8300 series. It goes without saying that the RV610LE is a 64-bit chip as well.

The second in line is RV610Pro chip, branded Radeon HD 2400 XT. This pup is paired with GDDR3 memory and is the first chip able to playback Full HD video (1920x1080), thanks to the fact that this is a fully-fledged RV610 GPU, no ultra-cheap-64-bit-only-PCB. RV630 Pro is an interesting one. Formally named Radeon HD 2600 Pro, it sports the very same DDR2 memory used on HD 2400Pro and GeForce 8500GTs we have, but there are some memory controller differences that will be revealed to you as soon as we get permission to put the pictures online.

The RV630XT - the Radeon HD 2600 XT - is nearly identical to the Pro version when it comes to the GPU, but this board is a monster when it comes to memory support, just like its predecessor, the X1600XT. However, this is the only product in the whole launch day line-up that has support for both GDDR-3 and GDDR-4 memory types. Both GDDR-3 and GDDR-4 memory will end up clocked to heavens high, meaning the excellent 8600GTS will have a fearsome competitor.

The R600 512MB is the grand finale. R600 is HD 2900 XT, as our Wily already disclosed. This board packs 512MB of GDDR-3 memory from Samsung, and offers same or a little better performance than 8800GTX, at a price point of 8800GTS. R600 GPU supports two independent video streams, so even a dual-link DVI can be done, even though we doubt this was high on AMD's priority list. This product is nine months late, and a refresh is around the corner, unless AMD continues to execute as ATi did.

The R600 1GB is very interesting. Originally, we heard about this product as a GDDR-4 only, and it is supposed to launch on Computex. We heard more details, and now you need to order at least 100 cards to get it, it will be available in limited quantities only. We expect that ATi will refrain from introduction until a dual-die product from nV shows up, so that AMD can offer CrossFire version with 2GB of video memory in total, for the same price as Nvidia's 1.5GB. Then again, in the war of video memory numbers, AMD is now losing to Nvidia flat-out.

AMD compromised its own product line-up with this 512MB card being the launch one, and no amount of marketing papers and powerpointery can negate the fact that AMD is nine months late and has 256MB of memory less than a six month old flagship product from the competitor. µ
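
The "elementary maths" the article alludes to works out roughly like this; a Python sketch using standard CEA-861 pixel clocks and single-link DVI's 165MHz TMDS limit, with the audio figure an assumption for 8-channel uncompressed PCM:

# Rough payload math for carrying HDMI video + audio over a DVI-sized pipe.
def video_gbps(pixel_clock_mhz, bits_per_pixel=24):
    return pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9

single_link_dvi = video_gbps(165)        # ~3.96 Gbps ceiling per DVI link
dual_link_dvi   = 2 * single_link_dvi    # ~7.92 Gbps
hd_720p60  = video_gbps(74.25)           # ~1.78 Gbps (720p60 pixel clock)
hd_1080p60 = video_gbps(148.5)           # ~3.56 Gbps (1080p60 pixel clock)
audio_8ch  = 8 * 192000 * 24 / 1e9       # ~0.04 Gbps, 8ch/24-bit/192kHz PCM (assumption)

print(hd_1080p60 + audio_8ch < dual_link_dvi)    # True - plenty of headroom on dual link
print(hd_720p60  + audio_8ch < single_link_dvi)  # True - even a single link covers 720p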


http://www.dailytech.com/article.aspx?newsid=6903

320-stream processors, named ATI Radeon HD 2900

AMD has named the rest of its upcoming ATI Radeon DirectX 10 product lineup. The new DirectX 10 product family received the ATI Radeon HD 2000-series moniker. For the new product generation, AMD has tagged HD to the product name to designate the entire lineup's Avivo HD technology. AMD has also removed the X-prefix on its product models.

At the top of the DirectX 10 chain, is the ATI Radeon HD 2900 XT. The AMD ATI Radeon HD 2900-series features 320 stream processors, over twice as many as NVIDIA's GeForce 8800 GTX. AMD couples the 320 stream processors with a 512-bit memory interface with eight channels. CrossFire support is now natively supported by the AMD ATI Radeon HD 2900-series; the external CrossFire dongle is a thing of the past.

The R600-based ATI Radeon HD 2900-series products also support 128-bit HDR rendering. AMD has also upped the ante on anti-aliasing support. The ATI Radeon HD 2900-series supports up to 24x anti-aliasing. NVIDIA's GeForce 8800-series only supports up to 16x anti-aliasing. AMD's ATI Radeon HD 2900-series also possesses physics processing.

New to the ATI Radeon HD 2900-series are integrated HDMI output capabilities with 5.1 surround sound. However, early images of AMD's OEM R600 reveal dual dual-link DVI outputs, rendering the audio functions useless.

AMD's RV630-based products will carry the ATI Radeon HD 2600 moniker with Pro and XT models. The value-targeted RV610-based products will carry the ATI Radeon HD 2400 name with Pro and XT models as well.

The entire AMD ATI Radeon HD 2000-family features the latest Avivo HD technology. AMD's upgraded Avivo with a new Universal Video Decoder, also known as UVD, and the new Advanced Video Processor, or AVP. UVD previously made its debut in the OEM-exclusive RV550 GPU core. UVD provides hardware acceleration of H.264 and VC-1 high definition video formats used by Blu-ray and HD DVD. The AVP allows the GPU to apply hardware acceleration and video processing functions while keeping power consumption low.

Expect AMD to launch the ATI Radeon HD 2000-family in the upcoming weeks, if AMD doesn't push back the launch dates further.



http://hardware.gotfrag.com/portal/story/37293/





arjuna_mfna
post Apr 15 2007, 04:15 PM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



When is the real release date of the HD 2X00 series?
g5sim
post Apr 16 2007, 07:23 AM

Look at all my stars!!
*******
Senior Member
5,757 posts

Joined: Jan 2003
From: Sri Kembangan


QUOTE(arjuna_mfna @ Apr 15 2007, 04:15 PM)
When is the real release date of the HD 2X00 series?
*
these coming weeks tongue.gif
Chanwsan
post Apr 16 2007, 12:43 PM

सोहम
******
Senior Member
1,406 posts

Joined: Dec 2004
From: Living Hell


Well, this is good news. I can't wait for the nVidia GCs to drop their prices and start the price war after the R6xx cards are released
arjuna_mfna
post Apr 16 2007, 05:18 PM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



QUOTE(g5sim @ Apr 16 2007, 07:23 AM)
these coming weeks  tongue.gif
*
thanks bro, that's good news, been waiting so long for that day... wanna see how this new baby performs...
Radeon
post Apr 16 2007, 06:37 PM

Semi-Retired Overclocker
*******
Senior Member
2,257 posts

Joined: Jan 2003

but honestly i can't believe what inquirer says....
TSikanayam
post Apr 18 2007, 02:11 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

Confirmed: The top end RV630 part does not have a power connector. Meaning it will be safely under 75W. I cannot wait.
g5sim
post Apr 18 2007, 09:23 PM

Look at all my stars!!
*******
Senior Member
5,757 posts

Joined: Jan 2003
From: Sri Kembangan


QUOTE(ikanayam @ Apr 18 2007, 02:11 PM)
Confirmed: The top end RV630 part does not have a power connector. Meaning it will be safely under 75W. I cannot wait.
*
shocking.gif my 9800 pro needs a power connector doh.gif doh.gif doh.gif guess the new ones are not as power hungry as the old cards biggrin.gif
c38y50y70
post Apr 18 2007, 09:43 PM

Getting Started
**
Validating
140 posts

Joined: Dec 2005
From: R&D Center & Home



Your 9800 Pro is using AGP, which is able to supply only about 50W. Even if the graphics card runs at 55W, it still requires an external power connector.
g5sim
post Apr 19 2007, 12:05 PM

Look at all my stars!!
*******
Senior Member
5,757 posts

Joined: Jan 2003
From: Sri Kembangan


QUOTE(c38y50y70 @ Apr 18 2007, 09:43 PM)
Your 9800 Pro is using AGP, which is able to supply only about 50W. Even if the graphics card runs at 55W, it still requires an external power connector.
*
oh i see. i didn't know that there is a power restriction on the AGP slot blush.gif
- L e O -
post Apr 19 2007, 04:18 PM

No Music No Life
******
Senior Member
1,995 posts

Joined: Jan 2003
From: Subang Jaya
yeah i think PCI-E is able to supply only 75W
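
For reference, a small Python sketch of the usual power budgets; the AGP figure is approximate, and the PCIe connector limits are the commonly cited spec values:

# Typical maximum board power by slot plus auxiliary connectors.
SLOT_W = {"AGP": 45, "PCIe x16": 75}     # AGP roughly 40-50W depending on revision
AUX_W  = {"6-pin": 75, "8-pin": 150}     # per PCIe auxiliary power connector

def board_budget(slot, *aux):
    return SLOT_W[slot] + sum(AUX_W[c] for c in aux)

print(board_budget("AGP"))                         # 45  -> why a ~55W 9800 Pro needs a molex
print(board_budget("PCIe x16"))                    # 75  -> the ceiling for a connector-less RV630
print(board_budget("PCIe x16", "6-pin"))           # 150 -> one 6-pin connector added
print(board_budget("PCIe x16", "6-pin", "8-pin"))  # 300 -> ceiling with both aux connectors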
prospeed_ballz
post Apr 19 2007, 08:06 PM

Casual
***
Junior Member
347 posts

Joined: Apr 2007
From: ipoh mali



this might sound stupid......
what is R600........
is it a graphics chipset........
why are some people waiting for this chip.........
Joseph Hahn
post Apr 19 2007, 10:00 PM

kpop k
*******
Senior Member
6,410 posts

Joined: Jan 2003
From: MLK
doh.gif Yes it's the latest high end graphic offering from ATi, but it's not out yet. Maybe next week or so... hopefully.
jinaun
post Apr 19 2007, 11:58 PM

where are my stars???
Group Icon
Elite
6,139 posts

Joined: Jan 2003
QUOTE(prospeed_ballz @ Apr 19 2007, 08:06 PM)
this might sound stupid......
what is R600........
is it a graphics chipset........
why are some people waiting for this chip.........
*
when people say R600.. it means the whole card which utilizes the R600 core
prospeed_ballz
post Apr 20 2007, 02:00 AM

Casual
***
Junior Member
347 posts

Joined: Apr 2007
From: ipoh mali



QUOTE(jinaun @ Apr 19 2007, 11:58 PM)
when people say R600.. it means the whole card which utilizes the R600 core
*
so is this kind of chipset better than the Nvidia chipset......
dx10 ready......?
how's da price.....?
if lower than the 8800GTX then i might wait for it also....



This post has been edited by prospeed_ballz: Apr 20 2007, 02:00 AM
g5sim
post Apr 20 2007, 05:15 AM

Look at all my stars!!
*******
Senior Member
5,757 posts

Joined: Jan 2003
From: Sri Kembangan


QUOTE(prospeed_ballz @ Apr 20 2007, 02:00 AM)
so is this kind of chipset better than the Nvidia chipset......
dx10 ready......?
how's da price.....?
if lower than the 8800GTX then i might wait for it also....
*
hmm requote:
tongue.gif
QUOTE
At the top of the DirectX 10 chain, is the ATI Radeon HD 2900 XT. The AMD ATI Radeon HD 2900-series features 320 stream processors, over twice as many as NVIDIA's GeForce 8800 GTX. AMD couples the 320 stream processors with a 512-bit memory interface with eight channels. CrossFire support is now natively supported by the AMD ATI Radeon HD 2900-series; the external CrossFire dongle is a thing of the past.

The R600-based ATI Radeon HD 2900-series products also support 128-bit HDR rendering. AMD has also upped the ante on anti-aliasing support. The ATI Radeon HD 2900-series supports up to 24x anti-aliasing. NVIDIA's GeForce 8800-series only supports up to 16x anti-aliasing. AMD's ATI Radeon HD 2900-series also possesses physics processing.

New to the ATI Radeon HD 2900-series are integrated HDMI output capabilities with 5.1 surround sound.

TSikanayam
post Apr 20 2007, 11:43 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

Finally, some proper pictures of RV630:
http://www.hardspell.com/english/doc/showc...=414&pageid=516

I don't know about the benchmark figures and all that, but that picture is legit.
accs_centre
post Apr 20 2007, 12:17 PM

Look at all my stars!!
*******
Senior Member
4,840 posts

Joined: Jan 2003
From: Universal

Can we expect the HD 2900 XT to be selling in the market on 14 May 07?
ruffstuff
post Apr 20 2007, 12:35 PM

Look at all my stars!!
*******
Senior Member
3,345 posts

Joined: Jan 2003
QUOTE(Radeon @ Apr 16 2007, 06:37 PM)
but honestly i can't believe what inquirer says....
*
I really want to believe them when they say the performance of an 8800GTX at the price of an 8800GTS. laugh.gif
lamely_named
post Apr 20 2007, 02:25 PM

I got younger. ROLLZ.
******
Senior Member
1,931 posts

Joined: Jan 2003
From: Human Mixbreeding Farm
QUOTE(ikanayam @ Apr 18 2007, 02:11 PM)
Confirmed: The top end RV630 part does not have a power connector. Meaning it will be safely under 75W. I cannot wait.
*
wowowowoweeee omgohzorz haxorz.

Finally, them power hungry wh*re are joining the nunnery.

I was getting worried that I might need to dangle molex connectors all over these new cards like it's somekind of eight legged spider.

hope Nvidia will do the same.

I knew there has got to be a better solution, they are just not looking hard enough for it, simply adding the jiggawatts and hulking the size of the GC is not the solution.

In the future, I would like to see USB Graphic card the shape/size of a pendrive. So we can dangle our 999999gtx around our neck at LAN party and pimp them with LED and laser etching. For cooling, we simply put them inside a water cooled copper cube box, with the USB connector end sticking out. So your rig's front panel will have a fist sized copper cube attached, complete with external water pump. You can pimp the copper cube to any shape you want, micky mouse, shape of a women, dildo, etc. Just dont touch it, coz it's hot.

Oh, Look at my usb 99999gtx thumb SLI, around my gold necklace, next to my brass knuckle .... and KEEP YOUR HANDS OFF MY GIRL!!!



This post has been edited by lamely_named: Apr 20 2007, 02:37 PM
g5sim
post Apr 22 2007, 06:17 AM

Look at all my stars!!
*******
Senior Member
5,757 posts

Joined: Jan 2003
From: Sri Kembangan


QUOTE(lamely_named @ Apr 20 2007, 02:25 PM)
wowowowoweeee omgohzorz haxorz.

Finally, them power hungry wh*re are joining the nunnery.

I was getting worried that I might need to dangle molex connectors all over these new cards like it's somekind of eight legged spider.

hope Nvidia will do the same.

I knew there has got to be a better solution, they are just not looking hard enough for it, simply adding the jiggawatts and hulking the size of the GC is not the solution.

In the future, I would like to see USB Graphic card the shape/size of a pendrive. So we can dangle our 999999gtx around our neck at LAN party and pimp them with LED and laser etching. For cooling, we simply put them inside a water cooled copper cube box, with the USB connector end sticking out. So your rig's front panel will have a fist sized copper cube attached, complete with external water pump. You can pimp the copper cube to any shape you want, micky mouse, shape of a women, dildo, etc. Just dont touch it, coz it's hot.

Oh, Look at my usb 99999gtx thumb SLI, around my gold necklace, next to my brass knuckle .... and KEEP YOUR HANDS OFF MY GIRL!!!
*
drunk again as usual lamely_named? rclxm9.gif drinking a crazy amount of langkau laugh.gif

akachester
post Apr 22 2007, 08:48 AM

Its Life. Live with it!
*******
Senior Member
7,689 posts

Joined: Jul 2005
From: The Land of No Return


QUOTE(ikanayam @ Apr 20 2007, 11:43 AM)
Finally, some proper pictures of RV630:
http://www.hardspell.com/english/doc/showc...=414&pageid=516

I don't know about the benchmark figures and all that, but that picture is legit.
*
Wow, looks very nice..Cant wait for it to be released... thumbup.gif
Radeon
post Apr 22 2007, 10:20 AM

Semi-Retired Overclocker
*******
Senior Member
2,257 posts

Joined: Jan 2003

QUOTE(ikanayam @ Apr 18 2007, 02:11 PM)
Confirmed: The top end RV630 part does not have a power connector. Meaning it will be safely under 75W. I cannot wait.
*
cool2.gif
arent you going for the R rather than the RV?
akachester
post Apr 22 2007, 10:21 AM

Its Life. Live with it!
*******
Senior Member
7,689 posts

Joined: Jul 2005
From: The Land of No Return


Assuming the benchmark up there is real, what will be the challenger for the 2600XT from NVidia? The 8600GTS?
arjuna_mfna
post Apr 22 2007, 11:00 AM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



the HD2600XT is priced at USD199, right? it should be around the RM700-850 price, like the 8600GTS..

can this baby fight the 8800GTS?
Najmods
post Apr 22 2007, 11:31 AM

*mutter mutter mutter mutter*
*******
Senior Member
5,211 posts

Joined: Feb 2005
From: Konohana


QUOTE(arjuna_mfna @ Apr 22 2007, 11:00 AM)
the HD2600XT is priced at USD199, right? it should be around the RM700-850 price, like the 8600GTS..

can this baby fight the 8800GTS?
*
Nobody knows this yet, but I believe it won't be as fast as the 8800GTS since it's not targeted to compete with it; probably an R600 GTO with cut-down pipes and memory will beat the 8800GTS
Joseph Hahn
post Apr 22 2007, 02:38 PM

kpop k
*******
Senior Member
6,410 posts

Joined: Jan 2003
From: MLK
QUOTE
Culver City (CA) - At around this time tomorrow, we'll be in the air on our way to Tunisia Africa for AMD's R600 media event.  AMD is sparing no expense for this press junket and, according to an AMD rep, have basically reserved the entire Sheraton Tunis Hotel for hundreds of media from around the world and "few dozen" AMD executives and other employees.

AMD will be showing off their next-generation R600 graphics card for both desktops and mobile platforms.  Most of the journalists should land sometime on April 22nd and the actual briefings will occur on the 23rd and 24th.

Unfortunately, you'll have to wait for all the juicy technical details because the reporters have signed a non-disclosure agreement that forbids them to talk for several weeks.  Of course that could prove difficult with hundreds of reporters in one spot.  Maybe AMD will sic the lions and cheetahs on reporters who get out of line. laugh.gif

The R600 launch comes at a crucial point in AMD's history because it's hemorrhaging money.  The chip maker reported a $611 million dollar quarterly loss yesterday and this launch could help turn things around.

The actual launch of the card will happen late May early June. rclxub.gif 

http://www.tgdaily.com/index.php?option=co...1722&Itemid=118
ruffstuff
post Apr 22 2007, 03:00 PM

Look at all my stars!!
*******
Senior Member
3,345 posts

Joined: Jan 2003
Is it that the R600 is too late, or was the G80 too early?
Eoma
post Apr 22 2007, 03:05 PM

- ,. -
Group Icon
Elite
4,603 posts

Joined: Jan 2003
From: PJ


Late May early June ? If it's true, forget it then lar. G80 here i come....
TSikanayam
post Apr 22 2007, 05:47 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(Radeon @ Apr 21 2007, 09:20 PM)
cool2.gif
arent you going for the R rather than the RV?
*
I don't necessarily buy something even if i find it interesting. It's just that it will be a great performance/watt part, and i have always been hoping for a large improvement in this area. Seems like i'll finally see it happen with the RV6xx series.
Suk
post Apr 22 2007, 11:46 PM

Look at all my stars!!
*******
Senior Member
2,330 posts

Joined: Jan 2003
From: 192.168.1.2



Today we saw FX57.net disclosed the R600 3DMark06 test results in VISTA.

Test platform:

Processor: Intel Core 2 Extreme X6800 2.93GHz
Mainboard: Intel i975X
Memory: DDR2-800 2GB
OS: Windows Vista
Driver: Catalyst 7.1(8.33), ForceWare 97.44

The score of the R600XTX 1GB is about 12000 while the NVIDIA 8800GTX is about 10500; the R600XTX is higher than the 8800GTX by about 15%. The R600XT reached 10000 while the 8800GTS is at 9000.



http://www.hardspell.com/english/doc/showc...asp?news_id=361
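
A quick check of the quoted gaps in Python (scores as reported by the linked article, so treat them as rumour):

# Relative leads implied by the leaked 3DMark06 scores above.
scores = {"R600XTX": 12000, "8800GTX": 10500, "R600XT": 10000, "8800GTS": 9000}

def lead_pct(a, b):
    return 100 * (scores[a] - scores[b]) / scores[b]

print(round(lead_pct("R600XTX", "8800GTX"), 1))   # 14.3 -> the "about 15%" claim
print(round(lead_pct("R600XT", "8800GTS"), 1))    # 11.1 -> XT leads the GTS by ~11%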
RokXIII
post Apr 23 2007, 12:58 AM

C'est la vie, Chérie
******
Senior Member
1,634 posts

Joined: Mar 2006
From: Ipoh @ Puchong


QUOTE(ruffstuff @ Apr 22 2007, 03:00 PM)
Is it that the R600 is too late, or was the G80 too early?
*
i would say that the R600 is too late... Even the 86xx series is already released, but there's still no good news from R600, just delays.... sad.gif
kucalana
post Apr 23 2007, 01:34 AM

Getting Started
**
Junior Member
151 posts

Joined: Dec 2005


the games are not out yet. better to wait for the games first and see which one is good (nv or ati)
afosz
post Apr 23 2007, 08:25 AM

Justice, My Foot!
******
Senior Member
1,413 posts

Joined: Jun 2006
From: Shah Alam
Yup, I believe ATI is waiting for new games with more demanding graphics to do the comparison. The G80's early release is just to grab more buyers tongue.gif
seanl
post Apr 23 2007, 01:55 PM

Enthusiast
*****
Senior Member
734 posts

Joined: Feb 2006
From: Selangor


QUOTE(afosz @ Apr 23 2007, 08:25 AM)
The G80's early release is just to grab more buyers tongue.gif
*
which they did so very successfully, much like xbox over PS3.....early bird catches a whole lot more of worms....

This post has been edited by seanl: Apr 23 2007, 04:17 PM
arjuna_mfna
post Apr 23 2007, 03:38 PM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



QUOTE(accs_centre @ Apr 20 2007, 12:17 PM)
Can we expect the HD 2900 XT to be selling in the market on 14 May 07?
*
how come you can say that? still don't have any news from AMD... so why are they taking so long
sniperwolf
post Apr 23 2007, 06:39 PM

Boku Wa Gundam
*******
Senior Member
2,810 posts

Joined: Jan 2003
From: Bangsar
Photo of the HD 2600XT DDR3 version. The article shows it doesn't require external power

user posted image

user posted image

More details are available HERE

Some sample benchmark results of the HD 2600XT compared to the 8600GTS are available HERE

NFSC main menu frame rates
user posted image

This post has been edited by sniperwolf: Apr 23 2007, 06:51 PM
akachester
post Apr 23 2007, 07:12 PM

Its Life. Live with it!
*******
Senior Member
7,689 posts

Joined: Jul 2005
From: The Land of No Return


The 2600XT is GDDR4? I saw that on the link given by sniperwolf..And it should be on par with the 8600GTS..

This post has been edited by akachester: Apr 23 2007, 07:13 PM
dos
post Apr 23 2007, 11:44 PM

Getting Started
**
Junior Member
188 posts

Joined: May 2006
It should beat the 8600GTS lah. The 8600 series is not as great a leap as expected.

Ok, is the article saying 3DMark 2600-2105, and 20 fps faster than the GTS in Need for Speed???
realone
post Apr 24 2007, 12:41 AM

Enthusiast
*****
Senior Member
865 posts

Joined: Jan 2003


user posted image

source: www.fudzilla.com


lets party drool.gif drool.gif
kucalana
post Apr 24 2007, 01:17 AM

Getting Started
**
Junior Member
151 posts

Joined: Dec 2005


nice cooling there....
is the fan enough to cool all of this copper? not enough airflow will make it run hotter i think.
anyway...nice thumbup.gif
johnnycp
post Apr 24 2007, 01:30 AM

On my way
****
Junior Member
600 posts

Joined: May 2005
From: Sabah
QUOTE(sniperwolf @ Apr 23 2007, 06:39 PM)
I cannot trust these pics
Notice the differences??

user posted image


TSikanayam
post Apr 24 2007, 01:33 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

2nd picture is the newer revision of the RV630xt board.
edwin3210
post Apr 24 2007, 01:37 AM

lll
*****
Senior Member
808 posts

Joined: Jan 2007
the highest end of the R600 family consumes 270W, wtf..... it really needs super cooling to stay cool and quiet at that power consumption.
ruffstuff
post Apr 24 2007, 01:38 AM

Look at all my stars!!
*******
Senior Member
3,345 posts

Joined: Jan 2003
The first picture is an obvious shoop. Look how wrong the focus is between the motherboard and the card itself.
seanl
post Apr 24 2007, 01:50 AM

Enthusiast
*****
Senior Member
734 posts

Joined: Feb 2006
From: Selangor


QUOTE(ruffstuff @ Apr 24 2007, 01:38 AM)
The first picture is an obvious shoop. Look how wrong the focus is between the motherboard and the card itself.
*
i think that's depth of field effect of the camera...
TSikanayam
post Apr 24 2007, 01:51 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(ruffstuff @ Apr 23 2007, 12:38 PM)
The first picture is an obvious shoop. Look how wrong the focus is between the motherboard and the card itself.
*
It's an early RV630xt board. Still has the power connector. It's not a shop.

This post has been edited by ikanayam: Apr 24 2007, 01:51 AM
empire23
post Apr 24 2007, 01:55 AM

Team Island Hopper
Group Icon
Staff
9,417 posts

Joined: Jan 2003
From: Bladin Point, Northern Territory
QUOTE(ruffstuff @ Apr 24 2007, 01:38 AM)
The first picture is an obvious shoop. Look how wrong the focus is between the motherboard and the card itself.
*
Not all cameras have a super aperture, bang tongue.gif
ruffstuff
post Apr 24 2007, 01:58 AM

Look at all my stars!!
*******
Senior Member
3,345 posts

Joined: Jan 2003
Lulz, it's still been through a shoop, for the save-as-JPG option at the very least.
arjuna_mfna
post Apr 24 2007, 09:47 AM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



can't wait for this baby to jump into the market... haven't used an ATI VGA for so long...
badguy86
post Apr 24 2007, 01:42 PM

Getting Started
**
Junior Member
292 posts

Joined: Sep 2006
From: Kuching, Sarawak, Malaysia



R600 Advantages ya? flex.gif
user posted image

mdzaboy
post Apr 24 2007, 05:55 PM

CuChee MunK KuK
*******
Senior Member
2,061 posts

Joined: Jan 2003
From: Jabaronie to Astaka Status: のあ..?



ATI Radeon HD 2900 XT Performance Benchmarks notworthy.gif notworthy.gif

taken from dailytech.com
badguy86
post Apr 24 2007, 06:58 PM

Getting Started
**
Junior Member
292 posts

Joined: Sep 2006
From: Kuching, Sarawak, Malaysia



QUOTE(mdzaboy @ Apr 24 2007, 05:55 PM)
user posted image

Waiseh! Look at the ON Fire graphic cooling, not bad ah. thumbup.gif I like its red color!
dos
post Apr 24 2007, 10:16 PM

Getting Started
**
Junior Member
188 posts

Joined: May 2006
Aw you beat me to it. How do they compare to a gtx?
Joseph Hahn
post Apr 24 2007, 10:52 PM

kpop k
*******
Senior Member
6,410 posts

Joined: Jan 2003
From: MLK
That's XT which is equivalent to GTS. Still waiting for XTX 1GB GDDR4 w00t to surface. hmm.gif

This post has been edited by Joseph Hahn: Apr 24 2007, 10:54 PM
Zicron
post Apr 24 2007, 11:24 PM

Come to the Dark Side
*****
Senior Member
899 posts

Joined: Mar 2005
From: Klang


More benchmark of RV630
http://www.ocer.net/article/a_show.php?id=10057

3Dmark06

Radeon HD 2600XT - 5760 marks
GeForce 8600GTS - 5900 marks
exhauster
post Apr 25 2007, 12:36 AM

Casual
***
Junior Member
427 posts

Joined: Sep 2006
From: KK



Following this--
"AMD's ATI Radeon HD 2900-series also possesses physics processing"
does this mean that by using it there's no need to buy the Asus PhysX card but you'll still have its effect???
bryanyeo87
post Apr 25 2007, 01:19 AM

Below the Belt
*******
Senior Member
3,175 posts

Joined: May 2006
I stumbled upon this while looking up Barcelona cores... notice the #1 rank... AMD... and an unknown model? maybe an engineer took it for a test drive? LOL

heres the link
http://www.boincstats.com/stats/host_cpu_s...=sah&st=0&or=10
salimbest83
post Apr 25 2007, 01:43 AM

♥PMS on certain day♥
*******
Senior Member
8,647 posts

Joined: Feb 2006
From: Jelutong Penang



2900XTX... that sounds nice....
how about their price..

max_cjs0101
post Apr 25 2007, 03:23 AM

Tarp hater and detector
Group Icon
Staff
1,368 posts

Joined: Nov 2004
From: A' Ghàidhealtachd


QUOTE(salimbest83 @ Apr 25 2007, 01:43 AM)
2900XTX... that sounds nice....
how about their price..
*
At launch it will surely be about RM2k+.
BTW, what's with ATI taking such a looong time to launch their products to consumers? shakehead.gif
Joseph Hahn
post Apr 25 2007, 04:59 AM

kpop k
*******
Senior Member
6,410 posts

Joined: Jan 2003
From: MLK
I hope it's around $600 USD which translates to about RM2053.26 right now. XD

This post has been edited by Joseph Hahn: Apr 25 2007, 05:00 AM
salimbest83
post Apr 25 2007, 06:28 AM

♥PMS on certain day♥
*******
Senior Member
8,647 posts

Joined: Feb 2006
From: Jelutong Penang



then how bout power consumption...
do we need a new PSU...


since it's built on 65nm ...
hope it will be less than the 8800 series....

TSikanayam
post Apr 25 2007, 06:39 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

Read the first post. R600 will be on 80nm, and all indications point to it being power hungry.
bOy13
post Apr 25 2007, 09:25 AM

New Member
*
Junior Member
35 posts

Joined: Aug 2006
From: Miri



QUOTE(Joseph Hahn @ Apr 25 2007, 04:59 AM)
I hope it's around $600 USD which translates to about RM2053.26 right now. XD
*
if it is really about RM2053.26 for such a good card, that's still an acceptable price.. wink.gif
Eoma
post Apr 25 2007, 09:32 AM

- ,. -
Group Icon
Elite
4,603 posts

Joined: Jan 2003
From: PJ


ikanayam:

In your professional opinion, is the R600 worth the wait? Will it bring more to the table? Geometry shaders etc, DX 10.1 (?). Disregarding any price drops on the GTX when the R600 is released, is it still a good idea to get a GTX?
TSikanayam
post Apr 25 2007, 09:58 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

^ Even if i did know, i cannot tell you right now. But no, i don't know. tongue.gif

It's coming soon, so i would just wait. The press is in Tunis already for the briefing.
Joseph Hahn
post Apr 25 2007, 12:27 PM

kpop k
*******
Senior Member
6,410 posts

Joined: Jan 2003
From: MLK
QUOTE(bOy13 @ Apr 25 2007, 09:25 AM)
if it is really about RM2053.26 for such a good card, that's still an acceptable price.. wink.gif
*
Nah that's just my speculation and direct currency converting in google anyway. More realistic price should be RM2200-2500. But of course cheaper is better. XD


In the meantime...
http://www.dailytech.com/Overclocking+the+...article7044.htm
QUOTE
I'm currently benchmarking the Radeon HD 2900 XTX, though I'll revise the XT if anyone has any particular requests.
Oh shi-, can't wait. drool.gif
Radeon
post Apr 25 2007, 02:59 PM

Semi-Retired Overclocker
*******
Senior Member
2,257 posts

Joined: Jan 2003

hope this one is real not the level 505 fluke
dos
post Apr 25 2007, 06:02 PM

Getting Started
**
Junior Member
188 posts

Joined: May 2006
It should be real. Those guys are at tunisia event.

Game settings for yesterday's dailytech benchmark.

QUOTE
The quality settings for the games were as follows:

Call of Duty 2 - Anisotropic filtering, 4xAA (in game), V-Sync off, Shadows enabled, a high number of dynamic lights, soften all smoke edges and an insane amount of corpses.

Company of Heroes - High shader quality, High model quality, Anti-aliasing enabled (in game), Ultra texture quality, high quality shadows, high quality reflections, Post processing On, High building detail, High physics, high tree quality, High terrain detail, Ultra effects fidelity, Ultra effects density, Object scarring enabled and the model detail slider all the way to the right.

F.E.A.R. - 4x FSAA (in game), maximum light details, shadows enabled, maximum shadow details, soft shadows enabled, 16x anisotropic filtering, maximum texture resolution, maximum videos, maximum shader quality.

Half Life 2: Episode 1 - High model detail, high texture detail, high shader detail, reflect all water details, high shadow detail, 4x multi-sample AA (in-game), 16x anisotropic filtering, v-sync disabled, full high-dynamic range.

Pretty much as high as you can crank it.


This post has been edited by dos: Apr 25 2007, 06:18 PM
realone
post Apr 25 2007, 06:13 PM

Enthusiast
*****
Senior Member
865 posts

Joined: Jan 2003


CONFIRMED R600 series out on 14 MAY


source: http://r800.blogspot.com/2007/04/radeon-29...-anandtech.html


Added on April 25, 2007, 6:16 pm
R610 & R630 details unveiled ...

user posted image


source : http://www.dailytech.com/article.aspx?newsid=6451

This post has been edited by realone: Apr 25 2007, 06:17 PM
arjuna_mfna
post Apr 25 2007, 09:18 PM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



at last the real day has come... just about 2 weeks more... hope this card will be more cost-efficient than the nvidia cards
realone
post Apr 25 2007, 09:35 PM

Enthusiast
*****
Senior Member
865 posts

Joined: Jan 2003


Don't forget that the R630 or RV630 has PCI-E 2.0 support. nice feature..
arjuna_mfna
post Apr 25 2007, 09:40 PM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



QUOTE(realone @ Apr 25 2007, 09:35 PM)
Don't forget that the R630 or RV630 has PCI-E 2.0 support. nice feature..
*
Too bad mobos that are able to support PCI-E 2.0 will only appear in Q3, and just on high-end Intel X38-based mobos
realone
post Apr 25 2007, 09:44 PM

Enthusiast
*****
Senior Member
865 posts

Joined: Jan 2003


QUOTE(arjuna_mfna @ Apr 25 2007, 09:40 PM)
Too bad mobos that are able to support PCI-E 2.0 will only appear in Q3, and just on high-end Intel X38-based mobos
*
Don't worry guys,, it is backward compatible ... fuyoh,, 2.5Gbps to 5.0Gbps per lane.

more details: http://en.wikipedia.org/wiki/PCI_Express
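
In concrete terms, a small Python sketch using the per-lane line rates from the PCIe 1.x/2.0 specs (8b/10b encoding overhead included):

# PCIe bandwidth per direction: line rate x 8b/10b efficiency x lane count.
def pcie_gbs(gt_per_s, lanes):
    return gt_per_s * (8 / 10) * lanes / 8   # GB/s per direction

print(pcie_gbs(2.5, 16))   # 4.0 GB/s - PCIe 1.x x16 (250 MB/s per lane)
print(pcie_gbs(5.0, 16))   # 8.0 GB/s - PCIe 2.0 x16 (500 MB/s per lane)
print(pcie_gbs(5.0, 8))    # 4.0 GB/s - a 2.0 card in 8 lanes still matches the old x16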

This post has been edited by realone: Apr 25 2007, 09:45 PM
badguy86
post Apr 26 2007, 02:04 AM

Getting Started
**
Junior Member
292 posts

Joined: Sep 2006
From: Kuching, Sarawak, Malaysia



Can the bandwidth jump from 2.5Gbps to 5Gbps be fully used? No, I don't think so. Even the current PCI-E 16X is not being fully used up.
TSikanayam
post Apr 26 2007, 06:15 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(badguy86 @ Apr 25 2007, 01:04 PM)
Can the bandwidth jump from 2.5Gbps to 5Gbps be fully used? No, I don't think so. Even the current PCI-E 16X is not being fully used up.
*
More bandwidth per pin is always nice, and the latency is cut in half, so that's another big improvement especially for latency sensitive devices like sound cards (one of the complaints was that pci-e had higher latency than regular old pci). GPUs are being used for so many general purpose apps now, so that extra bandwidth and lower latency cannot hurt. And if you're running a low end gpu that uses main memory, that surely helps.
badguy86
post Apr 26 2007, 12:45 PM

Getting Started
**
Junior Member
292 posts

Joined: Sep 2006
From: Kuching, Sarawak, Malaysia



QUOTE(ikanayam @ Apr 26 2007, 06:15 AM)
More bandwidth per pin is always nice, and the latency is cut in half, so that's another big improvement especially for latency sensitive devices like sound cards (one of the complaints was that pci-e had higher latency than regular old pci). GPUs are being used for so many general purpose apps now, so that extra bandwidth and lower latency cannot hurt. And if you're running a low end gpu that uses main memory, that surely helps.
*
Oh, u mean technologies like HyperMemory and nVIDIA's TurboCache that are implemented on those lower-end graphics cards. Ya, I agree with u on this point. nod.gif
What I mean is the current PCI-E can handle cards like the Geforce 8800 well enough. smile.gif
TSikanayam
post Apr 26 2007, 01:01 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(badguy86 @ Apr 25 2007, 11:45 PM)
Oh, u mean the technology like HyperMemory and nVIDIA's TurboCache that implement on those lower end graphic card. Ya, I agree with u in this point.  nod.gif
What I mean is current PCI-E can handle cards like Geforce 8800 good enough.  smile.gif
*
Good enough for regular graphics apps on a single card? Yes. However it's not enough for multi-gfx card solutions as we can see. Even pci-e 2.0 is probably not enough to enable truly scalable multi-card solutions. Hopefully Geneseo/pci-e 3.0 will remedy this.
badguy86
post Apr 26 2007, 01:47 PM

Getting Started
**
Junior Member
292 posts

Joined: Sep 2006
From: Kuching, Sarawak, Malaysia



QUOTE(ikanayam @ Apr 26 2007, 01:01 PM)
Good enough for regular graphics apps on a single card? Yes. However it's not enough for multi-gfx card solutions as we can see. Even pci-e 2.0 is probably not enough to enable truly scalable multi-card solutions. Hopefully Geneseo/pci-e 3.0 will remedy this.
*
Geneseo/pci-e 3.0 is still in the R&D stage, izit? smile.gif
Joseph Hahn
post Apr 26 2007, 02:39 PM

kpop k
*******
Senior Member
6,410 posts

Joined: Jan 2003
From: MLK
This is not looking good...
http://www.dailytech.com/article.aspx?newsid=7052
raist86
post Apr 26 2007, 02:50 PM

Regular
******
Senior Member
1,577 posts

Joined: May 2005
From: USJ


hmm.. could be due to unoptimized drivers... but quite surprised to see the benchmark scores that low.. huge differences in some games
CyNics
post Apr 26 2007, 03:29 PM

Getting Started
**
Junior Member
273 posts

Joined: Jan 2003
QUOTE(raist86 @ Apr 26 2007, 02:50 PM)
hmm.. could be due to unoptimized drivers... but quite surprised to see the benchmark score that low.. huge difference in some games
*
how much do you think the drivers can help in such a case? the gap is too big.
G80 Ultra will be the new king. manufacturers in taiwan haven't been happy with R600's performance since.. well, who cares, it's been delayed so many times.

was told by a mobo product manager that ATI made a bizarre request when the R600 was still in development (prototype). ATI wanted the manufacturer to extend the size of the motherboard so that the card wouldn't "look" so long laugh.gif champion....

This post has been edited by CyNics: Apr 26 2007, 03:30 PM
zeustronic
post Apr 26 2007, 03:42 PM

Retire OC Into Audiophiles
*******
Senior Member
2,605 posts

Joined: Jan 2006
From: A Place Between Heaven & Hell


relax guyz there no official drivers out yet.... best to wait till launch, see those feed back....
crazykilla
post Apr 26 2007, 04:57 PM

New Member
*
Junior Member
7 posts

Joined: Apr 2007
ATI is definitely out of the DX10 card race at the moment.
arjuna_mfna
post Apr 26 2007, 08:37 PM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



QUOTE(crazykilla @ Apr 26 2007, 04:57 PM)
ATI is defnitely out of the race of DX 10 card at the moment.
*
looks like that, nvidia dominates dx10-based cards nowadays.. but there are just a few dx10 games on the market, and even the dx10 drivers haven't fully utilized the api yet...

ati's target: launch their cards at the same time as the dx10 games launch..
badguy86
post Apr 26 2007, 09:44 PM

Getting Started
**
Junior Member
292 posts

Joined: Sep 2006
From: Kuching, Sarawak, Malaysia



We still need to sit back, wait and observe. U think AMD/ATI will do nothing and just let those new cards launch with such results? sweat.gif
TSikanayam
post Apr 26 2007, 10:31 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

I've been saying, i never expected the top end cards to be zomg. They were late, being late doesn't automatically make you better. It only well... makes you late.

The midrange is the one to look at...
Joseph Hahn
post Apr 26 2007, 11:06 PM

kpop k
*******
Senior Member
6,410 posts

Joined: Jan 2003
From: MLK
I'm very disappointed for now. But i shall wait until the actual launch and proper benchmarks of the retail version before i actually buy myself an 8800GTX for my new system. mega_shok.gif
TSikanayam
post Apr 26 2007, 11:24 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

The G80 used in the comparison was quite heavily overclocked, it seems.
linux11
post Apr 27 2007, 01:12 AM

Getting Started
**
Junior Member
104 posts

Joined: Jan 2005
From: Seremban


Most of the money is made in the mainstream segment of the GC market.

High-end cards are used to claim the performance crown plus bragging rights. Not too many people spend USD300+ on a GC.

RV630 against G84 would be interesting to watch, since this is where the money is. The battleground is the USD150-200 price range.
Joseph Hahn
post Apr 27 2007, 01:17 AM

kpop k
*******
Senior Member
6,410 posts

Joined: Jan 2003
From: MLK
Well it's very close to the rumored 8800 Ultra clock. tongue.gif
fantagero
post Apr 27 2007, 01:23 AM

[ToFish4RepliesLikeYours]
*******
Senior Member
2,723 posts

Joined: Jan 2006
From: Pekopon Planet ~~~



is it worth waiting for the new series??
or should i just upgrade to the 8600 series.. huhuh

derek87
post Apr 27 2007, 06:05 AM

Keep it C.L.E.A.N.
******
Senior Member
1,077 posts

Joined: Nov 2006
From: Sabah,Sandakan Status:STUNNED


QUOTE(Joseph Hahn @ Apr 26 2007, 11:06 PM)
I'm very disappointed for now. But i shall wait until the actual launch and proper benchmark on the retail version before i actually buy myself a 8800GTX for my new system. mega_shok.gif
*
I'm in the same situation as you are in now. I got my new system about 2 months ago, and the last thing was my GC. I wanted to put a POWERFUL GC such as an 8800 series card into it straight away coz i wanted everything to be "new" as in technology, but then Ati's DX10 cards are coming out too, so this is making me so confused.. After a little bit of thinking, i asked myself: if i get an 8800 series card now, what games can fully utilize it.. and do i play those games? So.... my point is, buy a card which suits your daily usage, not just for self-satisfaction or benchmarking. So in the end, i told myself to wait until Ati pops out, then i'll decide which one to buy, ATi or Nvidia.. even though the 8600 series is so tempting too. =p
Radeon
post Apr 27 2007, 09:07 AM

Semi-Retired Overclocker
*******
Senior Member
2,257 posts

Joined: Jan 2003

Company of Heroes, 1920x1200 (frames per second):
Radeon HD 2900 XTX - 53.2
Radeon HD 2900 XT  - N/A
GeForce 8800 GTX   - 80

rclxub.gif


WEAK!!!!!

the fact that they delayed for half a year, which is a lot in the IT world, and came up with this result is unacceptable
TSikanayam
post Apr 27 2007, 09:34 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

I would not be so quick to draw conclusions at this point. Some low level tests show the R600 vastly outperforming the G80 in many things, so i'm wondering why the game performance seems bad at this point.

http://images.dailytech.com/files/gpudip_results.html
http://www.notforidiots.com/gpudip%20results.html

Some real reviews will come out soon enough, so i would wait for those before drawing any final conclusions.

Another thing is i don't see why people always expect delayed things to be better. It just doesn't work that way.
Radeon
post Apr 27 2007, 11:54 AM

Semi-Retired Overclocker
*******
Senior Member
2,257 posts

Joined: Jan 2003

it's because, since the 8800gtx is out, they should have had a reference or target that they must exceed; there is no point launching a slower card with lousier performance. The market share will be long gone by then. Unless ATI really has SERIOUS problems in design and manufacturing. Nevertheless i still have faith in the R600 as i still do not quite trust dailytech biggrin.gif


Added on April 27, 2007, 11:59 am: AMD has decided to push forward the launch of the Radeon HD 2900 series to May 2nd instead of the original date of May 14th. Product demonstrations and reviews are allowed to appear on that date, so it is considered a soft launch. However, AMD is still keeping the Radeon HD 2600 and 2400 under wraps until the big day on May 14th. Radeon HD 2900 XT cards will be available from that day onwards for the price of US$399, to be positioned against the GeForce 8800 GTS. The final clocks for the Radeon HD 2900 XT stand at 740MHz core and 825MHz for the memory.

http://www.vr-zone.com/?i=4931


some good news perhaps?

This post has been edited by Radeon: Apr 27 2007, 11:59 AM
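A quick sanity check on those memory clocks (my own arithmetic, not from the VR-Zone piece), assuming 825MHz GDDR3 is double data rate on the R600's 512-bit bus, with the stock 8800 GTX (900MHz on 384-bit) for comparison:

CODE
# Peak memory bandwidth = effective transfer rate x bus width.
# Assumes GDDR3 is double data rate, so 825 MHz -> 1650 MT/s effective.

def mem_bw_gb_s(clock_mhz, bus_bits, data_rate=2):
    return clock_mhz * data_rate * 1e6 * bus_bits / 8 / 1e9

print(f"HD 2900 XT (825MHz, 512-bit): {mem_bw_gb_s(825, 512):.1f} GB/s")  # ~105.6
print(f"8800 GTX   (900MHz, 384-bit): {mem_bw_gb_s(900, 384):.1f} GB/s")  # ~86.4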
g5sim
post Apr 27 2007, 12:42 PM

Look at all my stars!!
*******
Senior Member
5,757 posts

Joined: Jan 2003
From: Sri Kembangan


QUOTE(Radeon @ Apr 27 2007, 09:07 AM)
Frames per second 1920x1200
Game Company of Heroes
Radeon HD
2900 XTX  Radeon HD2900 XT GeForce  8800 GTX
53.2                  N/A                          80

rclxub.gif
WEAK!!!!!

the fact they they delay for half and year which is a lot in the IT world and come up with this result is unacceptable
*
huh? isn't the GTX supposed to be vs the XTX? why GTX vs XT lah?
Eoma
post Apr 27 2007, 12:58 PM

- ,. -
Group Icon
Elite
4,603 posts

Joined: Jan 2003
From: PJ


Yup. XT vs GTS and XTX vs GTX. The XT is no competitor for the GTX.
The XTX will still be a long while more, though.
Joseph Hahn
post Apr 27 2007, 01:29 PM

kpop k
*******
Senior Member
6,410 posts

Joined: Jan 2003
From: MLK
QUOTE(Radeon @ April 27, 2007, 11:59 am)
AMD has decided to push forward the launch of the Radeon HD 2900 series to May 2nd instead of the original date on May 14th. Products demonstrations and reviews are allowed to appear on that date so it is considered a soft launch. However, AMD is still keeping Radeon HD 2600 and 2400 under wraps until the big day on May 14th. Radeon HD 2900 XT cards will be available from that day onwards for the price of US$399 to be positioned against the GeForce 8800 GTS. The final clocks for Radeon HD 2900 XT stood at 740MHz core and 825MHz for memories.

http://www.vr-zone.com/?i=4931
some good news perhaps?
*
XT only?! What about XTX ?? rclxub.gif

Radeon
post Apr 27 2007, 10:13 PM

Semi-Retired Overclocker
*******
Senior Member
2,257 posts

Joined: Jan 2003

QUOTE(g5sim @ Apr 27 2007, 12:42 PM)
huh? isnt GTX supposed to be vs XTX? why GTX vs XT lah?
*
it's xtx vs xt vs gtx

the xt has no fps info
dos
post Apr 28 2007, 10:31 AM

Getting Started
**
Junior Member
188 posts

Joined: May 2006
http://dailytech.com/ATI+Radeon+HD+2900+XT...article7052.htm

The XTX is going to suck, from the looks of this. Only a little better than an XT. That's it then, nvidia has got the best card for this round. I hope there'll be more info on the 2600 coming and how it compares.
zeustronic
post Apr 28 2007, 02:20 PM

Retire OC Into Audiophiles
*******
Senior Member
2,605 posts

Joined: Jan 2006
From: A Place Between Heaven & Hell


lol... it's an engineering sample vs a factory-overclocked 8800GTX with the core at 650Mhz (575Mhz standard), dude... & there are no official drivers for ATi yet. lol, ATi sucks?? Imagine, without ATi, would we still get cheap prices from NVidia? For me Guru3D is the most trusted site for GC comparisons, coz Guru3D does their reviews in much more detail....


here is the comparison, HD2900XT VS 8800GTS....
Link

This post has been edited by zeustronic: Apr 28 2007, 02:24 PM
t3chn0m4nc3r
post Apr 28 2007, 05:20 PM

Teh Necron Lord
*******
Senior Member
4,139 posts

Joined: Sep 2006
From: Internet


my board supports only crossfire... so i'll have to wait... unless i change my mobo... sad.gif
TSikanayam
post Apr 28 2007, 11:55 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

Attached Image

R600 block diagram.

Source: http://bbs.expreview.com/
dos
post Apr 29 2007, 10:18 AM

Getting Started
**
Junior Member
188 posts

Joined: May 2006
QUOTE(zeustronic @ Apr 28 2007, 02:20 PM)
lol... its engineering sample vs 8800GTX factory overclock core speed@ 650Mhz (575Mhz standard) dude... & there no official drivers for ATi. lol ATi sucks?? Imagine without ATi, we can get cheap price from NVidia. For me the Guru3D is the best trusted sites for GC comparison, coz Guru3D done their review much more detail....
here's is the comparison HD2900XT VS 8800GTS....
Link
*
You say you trust another website more and you provide a dailytech link... and the in-depth reviews are probably all under embargo currently.

Besides, I said nothing about price, only performance. The XTX does not look like it'll be able to compete with the GTX. The XTX is too similar to the XT to offer a big advantage the way the GTX does over the GTS. Maybe the R650 will be the XTX when it's ready.
zeustronic
post Apr 29 2007, 06:31 PM

Retire OC Into Audiophiles
*******
Senior Member
2,605 posts

Joined: Jan 2006
From: A Place Between Heaven & Hell


QUOTE(dos @ Apr 29 2007, 11:18 AM)
You say you trust another website more and you provide a dailytech link... and the in-depth reviews are all under embargo probably currently.

Besides I said nothing about price only performance. The XTX does not look like it'll be able to compete with the GTX. XTX too similar to the XT to offer big advantage like GTX and GTS. Maybe R650 will be XTX when it's ready.
*
well... yup, the comparison link i missed mentions N/A for the HD2900XT fps in Company of Heroes.... I thought your "XTX sucks" referred to it not being good; what i'm saying is also partly true: when the latest ATi is out, NVidia will drop their prices. & about the price issue, i'm telling it to everybody, not only u... Everybody likes a price drop, don't they??? For me the HD2900XT or XTX doesn't suck at all coz they have many more features than the GTX, features that are not for now but might be useful sooner or later.... it's part of a discussion, i'm not bashing u...

This post has been edited by zeustronic: Apr 29 2007, 06:34 PM
Najmods
post Apr 29 2007, 07:00 PM

*mutter mutter mutter mutter*
*******
Senior Member
5,211 posts

Joined: Feb 2005
From: Konohana


This stuff is not out yet guys, so any benchmark could be a hoax
arjuna_mfna
post Apr 29 2007, 08:43 PM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



read some reviews saying the ATi HD2X00 series will launch on 2nd May, is that for real?

stevenlee
post Apr 29 2007, 08:51 PM

look @ my Star *o* ...hoho
*******
Senior Member
3,158 posts

Joined: Apr 2005
From: -Butterworth, Penang-



swt...i just placed an order for my 8800gts....if it launches on 2 may then maybe a price war will happen....by the time i receive my 8800gts it will have been the more expensive buy already T_T
Najmods
post Apr 29 2007, 09:02 PM

*mutter mutter mutter mutter*
*******
Senior Member
5,211 posts

Joined: Feb 2005
From: Konohana


QUOTE(arjuna_mfna @ Apr 29 2007, 08:43 PM)
read some review that ATi HD2X00 series will lunch on 2nd May, is that for real?
*
It's out when it's out. I heard of that too, but I guess that's just a soft launch if it's true
TSikanayam
post Apr 30 2007, 05:51 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

http://www.forum-3dcenter.org/vbulletin/sh...5&postcount=351

http://www.forum-3dcenter.org/vbulletin/sh...3&postcount=364

Hehe, leaked marketing slides. Looks pretty fancy. Those custom filters are fancy.

RV630
http://img215.imageshack.us/img215/3317/hd2600wg2.jpg

RV610
http://img214.imageshack.us/img214/5963/hd2400xl4.jpg

Specs updated on 1st page.

This post has been edited by ikanayam: Apr 30 2007, 05:58 AM
LExus65
post Apr 30 2007, 09:49 AM

Old Gezzer.....
******
Senior Member
1,995 posts

Joined: May 2005


i tot launch was the 15th of may............ hrm.........we might expect a price drop from nvidia soon.......
Faint
post Apr 30 2007, 10:06 AM

Moving forward :)
*******
Senior Member
2,474 posts

Joined: Dec 2006
Anyone know the firm date of ATI R600 launch?
Wait for so long time liao.... cry.gif
TSikanayam
post Apr 30 2007, 10:10 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

The press have been briefed and they all have their cards now. So not much longer obviously.
gtoforce
post Apr 30 2007, 03:26 PM

SPAM AND BECOME A SENIOR MEMBER
*******
Senior Member
2,967 posts

Joined: May 2006



the conference had already stated mid may
Joseph Hahn
post Apr 30 2007, 07:00 PM

kpop k
*******
Senior Member
6,410 posts

Joined: Jan 2003
From: MLK
If VR-Zone information is correct, in 2 days ? o.o I hunger for UVD. ._.


Added on April 30, 2007, 7:03 pm: LOL nvm that article on VR-Zone is GONE! XD

This post has been edited by Joseph Hahn: Apr 30 2007, 07:03 PM
cstkl1
post Apr 30 2007, 09:22 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

err, i think u guys better state that the 2900xt has only 64 PHYSICAL stream processors

the 320 processing units figure is a bit misleading, as actually it's just ops/clock for the whole card.

Roughly, this is the difference between the cards

first, remember the R600 does 2 flops per unit while the G80 does only 1

So let's do some numbers
Card                 | 8800GTX | 2900XT | 8800GTS | 2600XT | 8600GTS
Gigaflops            | 518.4   | 475    | 345.6   | 192    | 139.2
Physical shaders     | 128     | 64     | 96      | 24     | 32
Shader clock (GHz)   | 1.35    | 0.742  | 1.2     | 0.8    | 1.45
Ops/clock            | 384     | 320    | 288     | 120    | 96
Ops/clock per shader | 3       | 5      | 3       | 5      | 3

most games today can only do 2 ops per clock cycle,
so i think that explains why the performance is the way it is

Nvidia G80's Gigaflops
http://en.wikipedia.org/wiki/GeForce_8_Series

Correct me on this
Advantage for ATI - 2 flops per processing unit
Much more advanced physical processors = 5 ops/cycle

Advantage for the G80 - well, most games only support around 2-3 ops/cycle.


Same argument as intel C2D's 11 ops/cycle vs AMD X2's 9 ops/cycle.
That's why intel is energy efficient.
Totally different architectures, and in my opinion the xt will kick the gts, on both the 2900 and 2600....just wondering what the flops on the x2900xt were....

thumbs up for the r600.. more efficient
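For anyone who wants to check the table, this is the usual way those theoretical peak figures are derived (a sketch of my own working; it counts each G80 SP as MADD+MUL = 3 flops/clock and each R600 ALU as a MADD = 2 flops/clock, which lands on the same totals, and like the numbers above it says nothing about real game performance).

CODE
# Theoretical peak = ALUs x shader clock (GHz) x flops per ALU per clock.
# Assumption: G80 SPs counted as MADD+MUL (3 flops, the extra MUL is debatable),
# R600 ALUs counted as MADD (2 flops). Peak numbers only.

cards = {
    "8800 GTX":   (128, 1.35,  3),
    "HD 2900 XT": (320, 0.742, 2),
    "8800 GTS":   ( 96, 1.20,  3),
    "HD 2600 XT": (120, 0.80,  2),
    "8600 GTS":   ( 32, 1.45,  3),
}

for name, (alus, ghz, flops) in cards.items():
    print(f"{name:11s} {alus * ghz * flops:6.1f} GFLOPS")
# 8800 GTX ~518.4, HD 2900 XT ~475, 8800 GTS ~345.6, HD 2600 XT ~192, 8600 GTS ~139.2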

hot off the press
this is the spec of the 8800gtx ultra and PICs!!!
crazy
[attachmentid=225190]

- GPU clock: 675MHz
- Memory clock: 2350MHz (1.0ns)
- Memory type: GDDR3
- Memory size: 768MB
- Memory bus: 384-bit
- Bandwidth: 112.8 GB/s
- Shader clock: 1600MHz
- Stream processors: 128
- GPU process technology: 90nm
can anybody check the source and see whether this is rubbish or not..
http://www.darkhardware.com/phpbb/viewtopic.php?p=1088803

sounds like bs at 2350, because the 80nm gddr3 limit is around 2.2ghz



This post has been edited by cstkl1: Apr 30 2007, 09:47 PM
TSikanayam
post Apr 30 2007, 11:13 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

I'm really not sure where you get your theories from... but no, it doesn't work that way.
cstkl1
post Apr 30 2007, 11:44 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(ikanayam @ Apr 30 2007, 11:13 PM)
I'm really not sure where you get your theories from... but no, it doesn't work that way.
*
uik
i thought the only way to look at two different types of gpu with different architectures was to look at ops/clock for the whole card and ops/clock per shader.


TSikanayam
post Apr 30 2007, 11:58 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(cstkl1 @ Apr 30 2007, 10:44 AM)
uik
i thought that was the only way two look at two different types of gpu with difference architecture was to look at the point of ops/clock for the whole card and ops/clock per shader.
*
Sure you can compute the GFLOPs rating, but it doesn't tell you anything about efficiency, and those ops aren't the same either. A different instruction mix will show very different results. So it's a nice theoretical figure, but it hardly explains game performance.

And i'm not even sure where you get your "most games today can do 2 ops per clock cycle" from; it doesn't even make sense. Games don't know about ops/clock or whatever.
cstkl1
post May 1 2007, 12:09 AM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(ikanayam @ Apr 30 2007, 11:58 PM)
Sure you can compute the GFLOPs rating, but it doesn't tell you anything about efficiency, and those ops aren't the same as well. A different instruction mix will show very different results. So it's a nice theoretical figure, but it hardly explains game performance.

And i'm not even sure where you get your "most games today can do 2 ops per each cycle clock", it doesn't even make sense. Games don't know about ops/clock or whatever.
*
yeah, that's what i thought too, and this guy corrected me on the amd forum like a year back about the 7900gtx vs 1900xtx, when there was an argument that the older 6 series and 7800gtx could not sustain their quoted gigaflops.

looking for the post now.
empire23
post May 1 2007, 12:59 PM

Team Island Hopper
Group Icon
Staff
9,417 posts

Joined: Jan 2003
From: Bladin Point, Northern Territory
I've seen pretty valuable information from ATi slides from yesterday's Press Release, i'm not impressed by the R600, but the RV630 is fukken awesome for the rated power figures.
almostthere
post May 1 2007, 04:54 PM

Kepala abah ko
Group Icon
VIP
3,773 posts

Joined: Jan 2003
From: Anywhere lah...as long got Kopi-O



And no one's even posted pictars of the full HD lineup yet? Someone should do that soon

As revealed over at XS, here are thar pictars

http://directupload.com/files/ijqdnznu4xtnr35tdnyc.jpg
http://directupload.com/files/mmdkjryoeoixztizydxj.jpg
HD2400

http://directupload.com/files/nddrzjm5jknwzjoh5jka.jpg
HD2600 Pro

http://directupload.com/files/jflmnynmjmgy4jjnxtqj.jpg
HD2600 XT

http://img219.imageshack.us/my.php?image=4...298cebbeew2.jpg
HD2900 XT

Driver:

user posted image
XP

user posted imageuser posted imageuser posted image
Vista

This post has been edited by almostthere: May 1 2007, 04:54 PM
fantagero
post May 1 2007, 05:23 PM

[ToFish4RepliesLikeYours]
*******
Senior Member
2,723 posts

Joined: Jan 2006
From: Pekopon Planet ~~~



QUOTE(fantagero @ Apr 27 2007, 01:23 AM)
is it worth to wait the new series??
or just upgrade to 8600 series.. huhuh
*
so.. it's around May eyh...
better wait and start saving.. rclxub.gif
derek87
post May 2 2007, 07:59 AM

Keep it C.L.E.A.N.
******
Senior Member
1,077 posts

Joined: Nov 2006
From: Sabah,Sandakan Status:STUNNED


hope it's out today!! loL!! checking it out today at lyp. I have been waiting for this baby for a very long time!!!
Faint
post May 2 2007, 11:59 AM

Moving forward :)
*******
Senior Member
2,474 posts

Joined: Dec 2006
Has anyone here gone to lowyat already? Are any ATI X2k series GCs on sale?
Zicron
post May 2 2007, 12:18 PM

Come to the Dark Side
*****
Senior Member
899 posts

Joined: Mar 2005
From: Klang


Hd2900XT crossfire vs 8800GTS SLI

3dmark2006 1600x1280

8800gts - 3226
HD2900XT - 3170 sweat.gif
Faint
post May 2 2007, 12:25 PM

Moving forward :)
*******
Senior Member
2,474 posts

Joined: Dec 2006
QUOTE(Zicron @ May 2 2007, 12:18 PM)
Hd2900XT crossfire vs 8800GTS SLI

3dmark2006 1600x1280

8800gts - 3226
HD2900XT - 3170  sweat.gif
*
WT......... Impossible ATI will lose der, 8800 still using GDDR3, while 2900 is GDDR4. How come?
kucalana
post May 2 2007, 12:49 PM

Getting Started
**
Junior Member
151 posts

Joined: Dec 2005


the 2900xt is using gddr3, the 2900xtx is using gddr4. read again.
cstkl1
post May 2 2007, 12:54 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

yeah, but check out the crossfire action

it shows that nvidia sli still has a lot of problems

ruffstuff
post May 2 2007, 01:00 PM

Look at all my stars!!
*******
Senior Member
3,345 posts

Joined: Jan 2003
QUOTE(Faint @ May 2 2007, 11:59 AM)
Does anyone here already go to lowyat? Got ATI X2k series GC sell?
*
I just got back from lowyat. I bought 2 of HD2900XT. The stock is very limited. I don't think theres more left at lowyat. It's selling like hot cakes. You too late son. I've tested with 3dmark2006 and the score is over nine hundred thousand!!! This is because HD2900XT is using GDDR4.

Btw, X2K series is for the mobile market. lulz.
Zicron
post May 2 2007, 01:41 PM

Come to the Dark Side
*****
Senior Member
899 posts

Joined: Mar 2005
From: Klang


QUOTE(ruffstuff @ May 2 2007, 01:00 PM)
I just got back from lowyat.  I bought 2 of HD2900XT. The stock is very limited. I don't think theres more left at lowyat.  It's selling like hot cakes.  You too late son.  I've tested with 3dmark2006 and the score is over nine hundred thousand!!! This is because HD2900XT is using GDDR4.

Btw, X2K series is for the mobile market. lulz.
*
How much did u buy it for? brows.gif
Nemesis181188
post May 2 2007, 01:43 PM

On my way
****
Senior Member
635 posts

Joined: Feb 2006
Wow. What brand did you buy? Give us some reviews yeah. I didn't know it was out so fast. sweat.gif

This post has been edited by Nemesis181188: May 2 2007, 03:01 PM
Joseph Hahn
post May 2 2007, 02:49 PM

kpop k
*******
Senior Member
6,410 posts

Joined: Jan 2003
From: MLK
QUOTE(ruffstuff @ May 2 2007, 01:00 PM)
I just got back from lowyat.  I bought 2 of HD2900XT. The stock is very limited. I don't think theres more left at lowyat.  It's selling like hot cakes.  You too late son.  I've tested with 3dmark2006 and the score is over nine hundred thousand!!! This is because HD2900XT is using GDDR4.

Btw, X2K series is for the mobile market. lulz.
*
GB2/B/ and f*** yourself.


Added on May 2, 2007, 2:54 pm: BTW what about the XTX? Is it really delayed?

This post has been edited by Joseph Hahn: May 2 2007, 02:54 PM
RokXIII
post May 2 2007, 04:25 PM

C'est la vie, Chérie
******
Senior Member
1,634 posts

Joined: Mar 2006
From: Ipoh @ Puchong


QUOTE(ruffstuff @ May 2 2007, 01:00 PM)
I just got back from lowyat.  I bought 2 of HD2900XT. The stock is very limited. I don't think theres more left at lowyat.  It's selling like hot cakes.  You too late son.  I've tested with 3dmark2006 and the score is over nine hundred thousand!!! This is because HD2900XT is using GDDR4.

Btw, X2K series is for the mobile market. lulz.
*
over 900000 in 3DMark 06??!! blink.gif Really or not....

heterosapiens
post May 2 2007, 04:28 PM

--capoeira--
******
Senior Member
1,379 posts

Joined: Oct 2004
From: => cyberjaya <=


QUOTE(Zicron @ May 2 2007, 01:41 PM)
How much u bought it? brows.gif
*
QUOTE(Nemesis181188 @ May 2 2007, 01:43 PM)
Wow.What brand did you bought?Give us some reviews yeah.I didn't know it was out so fast. sweat.gif
*
QUOTE(RokXIII @ May 2 2007, 04:25 PM)
over 900000 in 3DMark 06??!! blink.gif Really or not....
*
The Internet always finds a way to amuse me. laugh.gif
Faint
post May 2 2007, 05:06 PM

Moving forward :)
*******
Senior Member
2,474 posts

Joined: Dec 2006
QUOTE(ruffstuff @ May 2 2007, 01:00 PM)
I just got back from lowyat.  I bought 2 of HD2900XT. The stock is very limited. I don't think theres more left at lowyat.  It's selling like hot cakes.  You too late son.  I've tested with 3dmark2006 and the score is over nine hundred thousand!!! This is because HD2900XT is using GDDR4.

Btw, X2K series is for the mobile market. lulz.
*
900,000 marks in 3Dmark2006??? Really? Can you show some pictures to prove it?
akachester
post May 2 2007, 05:15 PM

Its Life. Live with it!
*******
Senior Member
7,689 posts

Joined: Jul 2005
From: The Land of No Return


lol..Please be serious here...
arjuna_mfna
post May 2 2007, 05:50 PM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



QUOTE(Zicron @ May 2 2007, 12:18 PM)
Hd2900XT crossfire vs 8800GTS SLI

3dmark2006 1600x1280

8800gts - 3226
HD2900XT - 3170  sweat.gif
*
3dmark is just for reference; what's important is real games...
Joseph Hahn
post May 2 2007, 06:48 PM

kpop k
*******
Senior Member
6,410 posts

Joined: Jan 2003
From: MLK
Dudes he's just bullshitting here. Ignore him.

Anyway, the XT seems better in most of the games. Some by quite a margin, but in some it loses to the GTS.
riku2replica
post May 2 2007, 09:19 PM

Mugi-chan!! 可愛い!!
*******
Senior Member
3,304 posts

Joined: Mar 2006
From: Chicago(Port25)
Maybe coz it seems to be more powerful in DX9 games compared to DX10 games.
badguy86
post May 2 2007, 09:24 PM

Getting Started
**
Junior Member
292 posts

Joined: Sep 2006
From: Kuching, Sarawak, Malaysia



Oh, now I know liaw! A senior group member with 2k posts, actually gained by bullshitting around. sweat.gif Or maybe he just wants to bring in some jokes while waiting for the ATI Radeon HD series to launch! tongue.gif
TSikanayam
post May 2 2007, 09:51 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

He was being sarcastic. Man, so many sarcasm failgets today.
empire23
post May 2 2007, 11:30 PM

Team Island Hopper
Group Icon
Staff
9,417 posts

Joined: Jan 2003
From: Bladin Point, Northern Territory
QUOTE(badguy86 @ May 2 2007, 09:24 PM)
Oh, now I know liaw! A senior group member, with 2k posts is actually gained by bullshitting around. sweat.gif  Or maybe he just want to bring in some joke while waiting for ATI Radeon HD series to launch!  tongue.gif
*
Geez it was just a sarcastic comment.

Anyways, a lot of info about the R6XX is kind of out. And the HW launch date is erm......gay, very gay.
Joseph Hahn
post May 3 2007, 12:17 AM

kpop k
*******
Senior Member
6,410 posts

Joined: Jan 2003
From: MLK
Sorry i just hate /b/tards so much. Burn in hell all of em. >_>

Still, any idea on XTX ? Goddammit...
almostthere
post May 3 2007, 01:02 AM

Kepala abah ko
Group Icon
VIP
3,773 posts

Joined: Jan 2003
From: Anywhere lah...as long got Kopi-O



QUOTE(ruffstuff @ May 2 2007, 01:00 PM)
I just got back from lowyat.  I bought 2 of HD2900XT. The stock is very limited. I don't think theres more left at lowyat.  It's selling like hot cakes.  You too late son.  I've tested with 3dmark2006 and the score is over nine hundred thousand!!! This is because HD2900XT is using GDDR4.

Btw, X2K series is for the mobile market. lulz.
*
Oi, do you want to get scolded by me or not? Since when can hardware become a thread over in the Kopitiam?

(And I'm being serious here)
CKC_1
post May 3 2007, 08:34 AM

On my way
****
Senior Member
552 posts

Joined: May 2006



omg..i just bought my 8800gts and now ATI wanna release theirs..X.X" i'm always supporting ATI but now i just hope their card won't beat the 8800gts! go nvidia!!
Radeon
post May 3 2007, 08:44 AM

Semi-Retired Overclocker
*******
Senior Member
2,257 posts

Joined: Jan 2003

the sample cards have indeed arrived at lowyat
price should be around RM 1.5k
performance not sure
that is for the XT

that's the info i got
arjuna_mfna
post May 3 2007, 09:51 AM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



check out this link... the HIS HD2900XT is for real...
http://www.hisdigital.com/html/product_ov.php?id=304
Radeon
post May 3 2007, 04:12 PM

Semi-Retired Overclocker
*******
Senior Member
2,257 posts

Joined: Jan 2003

i can't see any words or pictures on that page
shady
post May 3 2007, 04:18 PM

The Real Slim Shady
*******
Senior Member
2,818 posts

Joined: Jan 2003
From: Hokkaido, Japan



Here are the screenshots taken by techpowerup

user posted image

user posted image

user posted image


Radeon
post May 3 2007, 10:18 PM

Semi-Retired Overclocker
*******
Senior Member
2,257 posts

Joined: Jan 2003

apparently,
the content of the website has been removed,
however i can still browse the images


http://www.hisdigital.com/newimages/products/

here
shady
post May 4 2007, 12:13 AM

The Real Slim Shady
*******
Senior Member
2,818 posts

Joined: Jan 2003
From: Hokkaido, Japan



Fudzilla

Lots of rumours out there. 750W PSU?
riku2replica
post May 4 2007, 01:50 AM

Mugi-chan!! 可愛い!!
*******
Senior Member
3,304 posts

Joined: Mar 2006
From: Chicago(Port25)
But still, i'm waiting for the ATi HD 2900XT to slash the prices of other ATi graphics cards like the X1950XT...
arjuna_mfna
post May 4 2007, 08:18 AM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



QUOTE(Radeon @ May 3 2007, 04:12 PM)
i can't see any words or pictures on that page
*
it looks like the content of the page has been removed.. i believe the cards are ready to enter the market, just waiting for the official launch..
Radeon
post May 4 2007, 11:30 PM

Semi-Retired Overclocker
*******
Senior Member
2,257 posts

Joined: Jan 2003

Found some screenshots for half life 2

user posted image

user posted image

user posted image

user posted image

user posted image

user posted image

Source:
http://www.pcpop.com/doc/0/192/192401_1.shtml
fantagero
post May 4 2007, 11:40 PM

[ToFish4RepliesLikeYours]
*******
Senior Member
2,723 posts

Joined: Jan 2006
From: Pekopon Planet ~~~



QUOTE(riku2replica @ May 4 2007, 01:50 AM)
But still, i'm waiting for ATi HD X2900XT to slash other ATi graphic card like X1950XT...
*
me too.... i'm not that serious a gamer.. but if they release the 2k midrange series like the 8600.. maybe i can consider.. huhuuhh

warghh the longer i keep my money, it just keeps decreasing.. huhuh
Oly
post May 5 2007, 12:03 AM

Permanent Banned
*******
Senior Member
3,827 posts

Joined: Jan 2003
From: Here On The Chair Status : Eating Donut

ops...wrong thread...

This post has been edited by Oly: May 5 2007, 12:04 AM
Najmods
post May 5 2007, 01:29 AM

*mutter mutter mutter mutter*
*******
Senior Member
5,211 posts

Joined: Feb 2005
From: Konohana


I'm pretty much waiting for the X2600XT (if that's the midrange card to compete with the 8600GTS). If it comes with a PCIe power connector and dual-slot cooling then I'll definitely buy it (you didn't read that wrong, I love dual-slot cooling and PCIe power connectors on cards)
TSikanayam
post May 5 2007, 02:09 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(Najmods @ May 4 2007, 12:29 PM)
I'm pretty much waiting for X2600XT (If thats the midrange to compete with 8600GTS). If it comes with PCIe power connector and dual slot cooling then I'll definitely buy it (you don't read it wrong, I love dual slot cooling and PCIe power connector on cards)
*
You mean... you like hotter and more power hungry cards? tongue.gif
Faint
post May 5 2007, 03:09 AM

Moving forward :)
*******
Senior Member
2,474 posts

Joined: Dec 2006
Got any news about the price for X2xxx series graphic card?
phas3r
post May 5 2007, 03:33 AM

Regular
******
Senior Member
1,206 posts

Joined: Sep 2006


user posted image
user posted image
how do those benchies compare with the 8800 gts and gtx?..
but anyway, teh xt will just compete with teh gts
Radeon
post May 5 2007, 08:05 AM

Semi-Retired Overclocker
*******
Senior Member
2,257 posts

Joined: Jan 2003

phas3r, where is your source from?

imho it's a bit exaggerated
but i hope it's true also
banjarconverto
post May 5 2007, 08:51 AM

New Member
*
Junior Member
35 posts

Joined: Feb 2007
QUOTE
just got back from lowyat.  I bought 2 of HD2900XT. The stock is very limited. I don't think theres more left at lowyat.  It's selling like hot cakes.  You too late son.  I've tested with 3dmark2006 and the score is over nine hundred thousand!!! This is because HD2900XT is using GDDR4.


What?? Nine hundred thousand?? Are you sure??? That's 900,000 score in 3dmark06!! Be real man! shakehead.gif shakehead.gif

heheh..anyway, those graphs are way sick. more than 2x over the previous gen. i reckon those are reliable, after witnessing a quite similar jump in Nvidia's counterpart. unified shader technology really kicks some major azz..

This post has been edited by banjarconverto: May 5 2007, 08:57 AM
Najmods
post May 5 2007, 10:04 AM

*mutter mutter mutter mutter*
*******
Senior Member
5,211 posts

Joined: Feb 2005
From: Konohana


QUOTE(ikanayam @ May 5 2007, 02:09 AM)
You mean... you like hotter and more power hungry cards? tongue.gif
*

nod.gif but I don't like noisy cards

derek87
post May 5 2007, 01:24 PM

Keep it C.L.E.A.N.
******
Senior Member
1,077 posts

Joined: Nov 2006
From: Sabah,Sandakan Status:STUNNED


QUOTE(Najmods @ May 5 2007, 10:04 AM)
nod.gif  but I don't like noisy cards
*
strange... why like hotter and more power-hungry cards? Your gc will get damaged faster and your electric bill will shoot up high... "special fellow"
X.E.D
post May 6 2007, 10:42 AM

curmudgeonosorus emeritus
******
Senior Member
1,955 posts

Joined: Jan 2006
From: Llanfair­pwllgwyngyll­gogery­ch


Benchies for Doom 3 seem to be funny.
Wasn't that optimized for nVidia paths? DAAMIT really slaughtered stencil shadowing for real, this time?

BTW, HD2400's shape (Fudzilla) is pure lulz.
Najmods
post May 6 2007, 12:20 PM

*mutter mutter mutter mutter*
*******
Senior Member
5,211 posts

Joined: Feb 2005
From: Konohana


QUOTE(derek87 @ May 5 2007, 01:24 PM)
strange... why like hotter and hungry for power cards?Your gc will dmg faster and your electric bills will shoot up high... "special fellow"
*
What you basically mean is the 8800GTX will die faster than a 7600GT? Not really, some cards are perfectly fine running at high temps; if not, why would they bother releasing them in the first place? Who says having a power-hungry card will increase the electricity bill? Even if it really does, I don't mind at all
LExus65
post May 6 2007, 12:33 PM

Old Gezzer.....
******
Senior Member
1,995 posts

Joined: May 2005


an extra power connector being needed does not mean extra power hungry, but it does hint there is good room for overclocking
TSikanayam
post May 6 2007, 12:35 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(LExus65 @ May 5 2007, 11:33 PM)
extra power cord needed does not meant extra power hungry, but does hint there is a good room for overclocking
*
Interesting. So the power connector is just there to make it look pretty?


Added on May 6, 2007, 12:53 pm
QUOTE(X.E.D @ May 5 2007, 09:42 PM)
Benchies for Doom 3 seem to be funny.
Wasn't that optimized for nVidia paths? DAAMIT really slaughtered stencil shadowing for real, this time?

BTW, HD2400's shape (Fudzilla) is pure lulz.
*
What doom3 benchmarks are you referring to?

Doom3 is not nvidia optimized for the newer cards. Carmack had to make special paths for nv3x because it sucked balls. The newer cards should all run on the standard ARB path.

This post has been edited by ikanayam: May 6 2007, 12:53 PM
sleepy
post May 6 2007, 03:11 PM

Regular
******
Senior Member
1,139 posts

Joined: Jan 2003
Anyone read this yet?

http://www.xtremesystems.org/forums/showthread.php?t=143104
TSikanayam
post May 6 2007, 03:21 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

Assuming the tests are correct, i've been hearing that the G80 may be triangle setup limited. So for apps such as wireframe modeling, it should get smacked by the R600. A lot of professional apps like that use a lot more geometry so that may be why the R600 has an advantage.
salimbest83
post May 6 2007, 03:30 PM

♥PMS on certain day♥
*******
Senior Member
8,647 posts

Joined: Feb 2006
From: Jelutong Penang



QUOTE(Najmods @ May 6 2007, 12:20 PM)
What you basically mean is 8800GTX will die faster than 7600GT? Not really, some cards are perfectly fine play at high temps, if not why do they bother release it in the first place? Who say having a power hungry cards will increase electrical bill, if they really do, I don't mind at all
*
but sometimes we need to recalculate the total power usage for just our PC..

and if we're running our PC 24/7..

let's say 700W total..

0.700kW * 24 hours * 7 days = 117.6kWh per week
then at our current TNB rate.. RM0.218 per kWh....

u get about RM25.64 per week
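The same arithmetic as a tiny sketch (the flat RM0.218/kWh rate is an assumption; actual TNB billing is tiered):

CODE
# Weekly electricity cost of a PC left on 24/7, at a flat tariff.

def weekly_cost(load_watts, rate_per_kwh=0.218):
    kwh = load_watts / 1000 * 24 * 7
    return kwh, kwh * rate_per_kwh

kwh, cost = weekly_cost(700)
print(f"{kwh:.1f} kWh/week -> RM{cost:.2f}/week")  # 117.6 kWh/week -> RM25.64/week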
MNurdin
post May 6 2007, 04:32 PM

New Member
*
Newbie
3 posts

Joined: Apr 2007


QUOTE(Najmods @ May 6 2007, 12:20 PM)
What you basically mean is 8800GTX will die faster than 7600GT? Not really, some cards are perfectly fine play at high temps, if not why do they bother release it in the first place? Who say having a power hungry cards will increase electrical bill, if they really do, I don't mind at all
*
I ran a Gigabyte 7300GT (silent pipe) with a closed casing and no fan... the colours became messed up and the performance would really piss anyone off... put a fan under it and then it was OK... temp problem...

capacitor lifetime depends on the heat level, meaning if u freeze (temp-wise) ur capacitors permanently they will last maybe forever, as long as they're kept frozen... that's what i heard la... icon_rolleyes.gif
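The rule of thumb behind that (my own addition, a rough approximation rather than a guarantee) is that electrolytic capacitor life roughly doubles for every 10 degrees C below the rated temperature:

CODE
# "10-degree rule" estimate for electrolytic capacitor life
# (an Arrhenius-style approximation, not a guarantee).

def cap_life_hours(rated_life_h, rated_temp_c, actual_temp_c):
    return rated_life_h * 2 ** ((rated_temp_c - actual_temp_c) / 10)

# e.g. a 2000-hour, 105C-rated part:
for t in (95, 85, 65, 45):
    print(f"at {t}C: ~{cap_life_hours(2000, 105, t):,.0f} hours")
# 95C: ~4,000 h, 85C: ~8,000 h, 65C: ~32,000 h, 45C: ~128,000 h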
banjarconverto
post May 6 2007, 06:02 PM

New Member
*
Junior Member
35 posts

Joined: Feb 2007
Big news for those who are expecting the R600 this mid-May: they are all made on a 65nm process. Read more here : The Inquirer

But of course, The Inquirer...heheheh rclxms.gif rclxms.gif

Anyway, here's a presentation from AMD/ATi about R600 : Link Here

This post has been edited by banjarconverto: May 6 2007, 06:15 PM
derek87
post May 6 2007, 07:12 PM

Keep it C.L.E.A.N.
******
Senior Member
1,077 posts

Joined: Nov 2006
From: Sabah,Sandakan Status:STUNNED


QUOTE(Najmods @ May 6 2007, 12:20 PM)
What you basically mean is 8800GTX will die faster than 7600GT? Not really, some cards are perfectly fine play at high temps, if not why do they bother release it in the first place? Who say having a power hungry cards will increase electrical bill, if they really do, I don't mind at all
*
Dies faster in the long term, not in 1 or 2 years' time, perhaps 4~6 years? It kills the lifespan of the card. Just my opinion. And also for 24/7 users (like me), high-heat/power-hungry cards are definitely a problem. A card which gives off a lot of heat affects the ambient temperature in the casing, which means heat accumulates inside the casing. Of course, the solution is to turn on the air con, but we are trying to save what we can save (money of course). =)

I wouldn't mind either if it's a high-heat, power-hungry card, but of course it would be better if it gave off less heat and drew less power. What do you say, pal?

This post has been edited by derek87: May 6 2007, 07:14 PM
TSikanayam
post May 7 2007, 12:47 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(banjarconverto @ May 6 2007, 05:02 AM)
Big news for those who are expecting R600 this mid May, they are all made under 65-nm process. Read more here : The Inquirer

But of course, The Inquirer...heheheh rclxms.gif  rclxms.gif
*
Nope, to the best of my knowledge, that is not true. It will be 80nm.
banjarconverto
post May 7 2007, 01:28 AM

New Member
*
Junior Member
35 posts

Joined: Feb 2007
Yep, that's why..The Inquirer..hahahaha...

Although early adopters of the R600 are advised to wait for the 65nm version... money talks. laugh.gif laugh.gif
TSikanayam
post May 7 2007, 01:38 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(banjarconverto @ May 6 2007, 12:28 PM)
Yep, that's why..The Inquirer..hahahaha...

Although, early adopters of R600 are adviced to wait for the 65-nm version, but...money talks. laugh.gif  laugh.gif
*
Do you even know there will be a 65nm version of the R600? I don't see one coming anytime soon. The high-end refresh is a long way off, unless some serious change of plans has been made.
g5sim
post May 7 2007, 03:35 AM

Look at all my stars!!
*******
Senior Member
5,757 posts

Joined: Jan 2003
From: Sri Kembangan


Yeeehawwww HD2**** cards DVI to HDMI dongle leaked biggrin.gif biggrin.gif biggrin.gif

http://www.theinquirer.net/default.aspx?article=39370

OUR OWN WILY FERRET ran a couple of HD2900XT stories earlier in the day, touching on the subject of the HDMI dongle. This unique feature brings HDMI not just to the high-end, but to the mainstream and low-end markets as well - without messing up the dual-display capability, which was the norm for first-gen cards with HDMI.
We asked around some of AMD's channel partners and managed to snag a picture of the dongle that will bring HDTV connectivity to the masses.

So, without further ado, here is the adapter that is giving Graphzilla nightmares since it means the R600 can do HDCP, Daamit, in any resolution out there. Same thing goes for integrated sound.

user posted image

As you can see, it looks very simple. Regardless of all the delays that have plagued AMD and ATI, we have to admit that this dongle just looks as it should - simple. We hope that it works as advertised. µ


TSikanayam
post May 7 2007, 07:15 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(g5sim @ May 6 2007, 02:35 PM)
Yeeehawwww HD2**** cards DVI to HDMI dongle leaked biggrin.gif biggrin.gif biggrin.gif

http://www.theinquirer.net/default.aspx?article=39370

OUR OWN WILY FERRET ran couple of HD2900XT stories earlier in the day, touching n the subject of HDMI dongle. This unique feature brings HDMI not just to high-end, but to the mainstream and low-end markets as well - without messing up the dual-display capability, which was a norm for first-gen cards with HDMI.
We asked around some of AMD's channel partners and managed to snag a picture of the dongle that will bring HDTV connectivity to the masses.

So, without further ado, here is the adapter that is giving Graphzilla nightmares since it means the R600 can do HDCP, Daamit, in any resolution out there. Same thing goes for integrated sound.

user posted image

As you can see, it looks very simple. Regardless of all the delays that have plagued AMD and ATI, we have to admit that this dongle just looks as it should - simple. We hope that it works as advertised. µ
*
The ignorance is appalling.

1. A HDMI adapter has ABSOLUTELY NOTHING to do with the fact that the R600 can do HDCP in "any resolution out there".

2. The R600 can do HDCP up to dual-link DVI modes. Which is technically not exactly "any resolution out there", but really that's all the resolutions it supports. It's better than the G80 which can do it only for single link modes even though it can output unprotected content at dual link rates.

3. There will be limitations to using the HDMI dongle. Doesn't work for "any resolution out there", to put it in inqspeek. lol.

This post has been edited by ikanayam: May 7 2007, 07:17 AM
Joseph Hahn
post May 7 2007, 12:26 PM

kpop k
*******
Senior Member
6,410 posts

Joined: Jan 2003
From: MLK
I've been wondering whether the XTX will come out along with the XT. Any ideas ?
skydna
post May 7 2007, 01:50 PM

Getting Started
**
Junior Member
236 posts

Joined: Jan 2003
QUOTE(Joseph Hahn @ May 7 2007, 12:26 PM)
I've been wondering whether if the XTX will come out along with the XT. Any ideas ?
*
just saw from some news that AMD won't launch the 2900xtx version
so the 8800GTX and Ultra will still have a monopoly in the market
XTREME
post May 7 2007, 02:05 PM

 
*******
Senior Member
4,599 posts

Joined: Jan 2003


for those who haven't read yet sweat.gif

http://www.nordichardware.com/forum/viewto...c=8428&forum=45 rolleyes.gif

more discussions here:
http://www.xtremesystems.org/forums/showthread.php?t=143190 whistling.gif

hmm.gif

brows.gif


Attached image(s)
Attached Image
banjarconverto
post May 7 2007, 03:50 PM

New Member
*
Junior Member
35 posts

Joined: Feb 2007
QUOTE
Do you even know there will be a 65nm version of the R600? I don't see any coming anytime soon. The high end refresh is a long way off, unless some serious change of plans were made.


Browse around, please. The plan to migrate to 65nm is not a rumour, it's a plan; what is rumour is the 'when'. brows.gif

Anyway, here's a lowdown on AMD's midrange R6XX :

user posted image

Further stories are here!

So, best of luck in cracking your head on which one to buy soon.. whistling.gif whistling.gif

This post has been edited by banjarconverto: May 7 2007, 04:16 PM
TSikanayam
post May 7 2007, 04:18 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(banjarconverto @ May 7 2007, 02:50 AM)
Browse around, please. The plan to migrate to 65-nm is not a rumour, what rumour is 'when'.  brows.gif
*
Browse around where? Where is it "confirmed" that they are going to make a 65nm R600?
banjarconverto
post May 7 2007, 04:30 PM

New Member
*
Junior Member
35 posts

Joined: Feb 2007
This is one of them : Here

I won't spoon-feed you with the rest; just Googling around for 'R600 65-nm' will do the trick. Like I said, 'WHEN' is the rumour, but it is a plan.
empire23
post May 7 2007, 04:44 PM

Team Island Hopper
Group Icon
Staff
9,417 posts

Joined: Jan 2003
From: Bladin Point, Northern Territory
QUOTE(banjarconverto @ May 7 2007, 04:30 PM)
This is one of them : Here

I won't spoon feed you with the rest, just Google around with 'R600 65-nm' will do the trick. Like I said, 'WHEN' is the rumour, but it is a plan.
*
Erm, that's not even correct. That's a plan of alignment. The 65 nanometer parts take up the value and mainstream segments due to their lower speeds and transistor counts, while the 80 nanometer HS process serves only the R600.

And no, ATI has no plans to offer a 65nm R600. Screw the NDA anyway, i heard this from AMD/ATI themselves. Yields are just too low and the silicon just not fast enough for the R600 yet.

If you wanna spoonfeed people, don't spoon-feed shit lah.
almostthere
post May 7 2007, 04:48 PM

Kepala abah ko
Group Icon
VIP
3,773 posts

Joined: Jan 2003
From: Anywhere lah...as long got Kopi-O



That is the problem with philistines, as if it's easy to respin a design to a smaller process within months, when those in the industry would shake their heads and tell you that's ridiculous. It's like saying let's make a Proton Campro with 20 valves and put it into production in 6 months' time. Gawd I love simple assumptions

This post has been edited by almostthere: May 7 2007, 04:49 PM
banjarconverto
post May 7 2007, 04:49 PM

New Member
*
Junior Member
35 posts

Joined: Feb 2007
OK, maybe I should have indicated R6XX instead of R600, which indicates the high-end range. I apologize for the typo, but I didn't mean to feed you guys shit or whatever.

And your response is undeniably rude.
almostthere
post May 7 2007, 04:51 PM

Kepala abah ko
Group Icon
VIP
3,773 posts

Joined: Jan 2003
From: Anywhere lah...as long got Kopi-O



QUOTE(banjarconverto @ May 7 2007, 04:49 PM)
OK, maybe I should have indicate R6XX, instead of R600 which indicates high end range. I apologize from the typo, but I didn't mean to provide shit whatever to you guys.

And your respond is undeniably rude.
*
Be precise then. R6XX still indicates high end. If you were mentioning midrange and lower, indicate RV6XX instead; then it'd be better understood and there'd be less confrontation. A little bit does go a long way

This post has been edited by almostthere: May 7 2007, 04:52 PM
TSikanayam
post May 7 2007, 04:52 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(empire23 @ May 7 2007, 03:44 AM)
If you wanna spoonfeed people. Don't spoon feed shit lah.
*
LOL. Owned.

And seriously banjarconverto, read the first post. i've known about those 65nm chips since last year.
banjarconverto
post May 7 2007, 04:54 PM

New Member
*
Junior Member
35 posts

Joined: Feb 2007
Wah, so I was pretty much wrong at every step in trying to provide more info. Never mind, it's learning time. biggrin.gif biggrin.gif
empire23
post May 7 2007, 04:55 PM

Team Island Hopper
Group Icon
Staff
9,417 posts

Joined: Jan 2003
From: Bladin Point, Northern Territory
QUOTE(banjarconverto @ May 7 2007, 04:49 PM)
OK, maybe I should have indicate R6XX, instead of R600 which indicates high end range. I apologize from the typo, but I didn't mean to provide shit whatever to you guys.

And your respond is undeniably rude.
*
Never act confident and mighty if you are not even sure of what you're typing; if you do, you're asking for a rude rebuttal.

A typo doesn't explain that, because ATi has never claimed in their roadmap that their whole line would automatically transition to 65nm. Must be rumours from Fudzilla or theInq tongue.gif. A lot here don't actively assert what they know and instead just question, because they're under NDA, which i am too. So AMD lawyers, i'm right here. Buttrape me.

No plans can be made without silicon; until 65nm at TSMC proves viable for high speeds, the yields good enough and the production line able to cope with the load. No, no plans until then.
jinaun
post May 7 2007, 04:55 PM

where are my stars???
Group Icon
Elite
6,139 posts

Joined: Jan 2003
i think i saw an R600 demo unit in compuzone yesterday...

8-pin + 6-pin PCIE aux power.. rite?? and a red flamy plastic shroud covering the heatsink

This post has been edited by jinaun: May 7 2007, 04:56 PM
almostthere
post May 7 2007, 04:56 PM

Kepala abah ko
Group Icon
VIP
3,773 posts

Joined: Jan 2003
From: Anywhere lah...as long got Kopi-O



QUOTE(jinaun @ May 7 2007, 04:55 PM)
i think i saw a R600 demo unit in compuzone yesterday...
*
I'd doubt that's an R600 unless what I've been told about availability is totally FUD
empire23
post May 7 2007, 04:58 PM

Team Island Hopper
Group Icon
Staff
9,417 posts

Joined: Jan 2003
From: Bladin Point, Northern Territory
QUOTE(jinaun @ May 7 2007, 04:55 PM)
i think i saw a R600 demo unit in compuzone yesterday...
*
Saw those; unless specifically run, most of them are just show models. But here's a tip, MSI and Powercolor have awesome coolers, silent-looking ones too. Asus, who knows, they only sent a box to look at, a giant box.

Hardware isn't here, not by a long shot.
banjarconverto
post May 7 2007, 05:03 PM

New Member
*
Junior Member
35 posts

Joined: Feb 2007
Okay, but I did come across some news about AMD's plan to release the R600 (yes!) in an 80nm flavour first, and then refresh the high-end range later with a 65nm process, which created some sort of dissatisfaction among forumers reading the news. I'm sorry I can't remember which news item it was nor provide any links, but it was so damning that it stuck in my mind, which led me to my previous posts.

However, you guys know better. I just confidently thought my info would help. And no offence to the TS either.. smile.gif smile.gif
jinaun
post May 7 2007, 05:06 PM

where are my stars???
Group Icon
Elite
6,139 posts

Joined: Jan 2003
QUOTE(empire23 @ May 7 2007, 04:58 PM)
Saw those, unless specifically run, most of them are just show models. But here's a tip, MSI and Powercolor have awesome coolers, silent looking ones too. Usus, entah, they only sent a box to look at, a giant box.

Hardware isn't here, not by a long shot.
*
yeah.. they are running 3dmark06..

at the scene where one heavily armored guy w/ heavy firepower is shooting at those pathetic people behind those boxes.. averaging abt 55~fps at that point

if it's not an R600.. then it's something else disguised to look like an R600.. LOL

This post has been edited by jinaun: May 7 2007, 05:11 PM
empire23
post May 7 2007, 05:09 PM

Team Island Hopper
Group Icon
Staff
9,417 posts

Joined: Jan 2003
From: Bladin Point, Northern Territory
QUOTE(banjarconverto @ May 7 2007, 05:03 PM)
Okay, but I did came across a news about how AMD action to release R600 (yes!) in 80-nm flavour first, and then refresh the high end range later with 65-nm process, which created some sort of dissatisfaction among forumers of the news. I'm sorry I can't remember which news nor provide any links, but it's so damning that it stuck in my mind, which lead me to my previous posts.

However, you guys know better. I just confidently thought my info would help. And no offence to the TS either.. smile.gif  smile.gif
*
I think i wrote about that in an article. I specifically stated that i was hopeful because it's been ATi's habit to do so for the past few generations. But IIRC, i also clearly stated that it was one of those "might be" things. It's seriously not a fact until ATI/AMD can get the silicon characteristics needed, although we can pray.


Added on May 7, 2007, 5:13 pm
QUOTE(jinaun @ May 7 2007, 05:06 PM)
yeah.. they are running 3dmark06.. 

at the scene where one heavy armored guy shooting at those pathetic people behind those boxes.. averaging abt 55~fps at that point

if its not R600.. then its something else disguised to look like R600.. LOL
*
Talked to Jerry Teng about it during the press day, he says there are 2 working units given out for demos, and all partners have strict orders to wait for the magic date before they can actually release their stock.

But ATI PR being notoriously inefficient, *grumble....stoopid slides*, i'd rather not trust them sometimes. IT'S BEEN 7 DAYS YOU LAZY TARDS!



This post has been edited by empire23: May 7 2007, 05:13 PM
redbull_y2k
post May 7 2007, 06:20 PM

"Viva La Revolucion!"
******
Senior Member
1,469 posts

Joined: Jan 2003
From: YOU.ESS.JAY



QUOTE(jinaun @ May 7 2007, 05:06 PM)
yeah.. they are running 3dmark06.. 

at the scene where one heavy armored guy/w heavy firepower shooting at those pathetic people behind those boxes.. averaging abt 55~fps at that point

if its not R600.. then its something else disguised to look like R600.. LOL
*
They had the HD2900XT hardware sample since last week.
Joseph Hahn
post May 7 2007, 09:03 PM

kpop k
*******
Senior Member
6,410 posts

Joined: Jan 2003
From: MLK
Uh so... the XTX is not vaporware then, it's just delayed. But will it be 80nm or 65nm? I hope 65nm since they are delaying so much already. doh.gif But that would be R650 already, no? I'm so confused about whether to wait or not... sheesh. I really want the highest end because I won't be upgrading my PC for at least 2 years after this. :/
HaHaNoCluE
post May 7 2007, 09:09 PM

Newbie
****
Senior Member
628 posts

Joined: Oct 2006


scanart selling gecube x2900xt 512ddr3 for rm1,799.00... it seems a bit expensive though...
almostthere
post May 7 2007, 09:11 PM

Kepala abah ko
Group Icon
VIP
3,773 posts

Joined: Jan 2003
From: Anywhere lah...as long got Kopi-O



QUOTE(Joseph Hahn @ May 7 2007, 09:03 PM)
Uh so... XTX is not a vaporware then, it's just delayed but will it be 80nm or 65nm? I hope 65nm since they are delaying so much already. doh.gif But that would be R650 already no ? I'm so blur whether to wait or not... sheesh. I really want the highest end because i won't be upgrading my PC for at least 2 years after this. :/
*
for the LAST time, repeat after me: "R600 WILL BE 80nm", expect no less than that. And I can bet 3 pieces of ikan bawal masak sambal that 65nm won't happen during R600's lifetime
shady
post May 7 2007, 09:14 PM

The Real Slim Shady
*******
Senior Member
2,818 posts

Joined: Jan 2003
From: Hokkaido, Japan



HD 2900XT is over my budget. Hoping HD 2900XL would offer good performance at 8800GTS 320MB price range.
HaHaNoCluE
post May 7 2007, 09:25 PM

Newbie
****
Senior Member
628 posts

Joined: Oct 2006


some reviews have shown that the X2900XT doesn't perform much differently from the 8800GTS, even the 320MB version, at 1680x1050 and lower... so RM1.7k is really a bit high...
derek87
post May 7 2007, 10:01 PM

Keep it C.L.E.A.N.
******
Senior Member
1,077 posts

Joined: Nov 2006
From: Sabah,Sandakan Status:STUNNED


Wah... why is ATI so slow..? I asked Viewnet about the X2000 series. They say have to wait till June. So I guess those midrange X2K cards will take even longer? ggrrrr

This post has been edited by derek87: May 7 2007, 10:02 PM
jinaun
post May 7 2007, 10:18 PM

where are my stars???
Group Icon
Elite
6,139 posts

Joined: Jan 2003
QUOTE(almostthere @ May 7 2007, 09:11 PM)
for the LAST time, repeat after me, "R600 WILL BE 80nm", expect no less then that. And  I can bet 3 pieces of ikan bawal masak sambal it won't occur during R600's lifetime
*
probably it will face the same fate as R520... quickly replaced by R580

i'll wait for R680+ then... LOL..
arjuna_mfna
post May 7 2007, 10:37 PM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



RM1.8k?... argh... I'm KO already, maybe wait for the lower spec ATI X2900XL
LExus65
post May 7 2007, 11:05 PM

Old Gezzer.....
******
Senior Member
1,995 posts

Joined: May 2005


wah, X2900XT so expensive??? it's like the 1950 XTX..........
sleepy
post May 7 2007, 11:33 PM

Regular
******
Senior Member
1,139 posts

Joined: Jan 2003
Hmm, 1799? So is the card available now from scanart? Or just the MSRP?
TSikanayam
post May 8 2007, 01:42 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(jinaun @ May 7 2007, 09:18 AM)
probably it will face the same fate at R520... quickly replaced with R580

i'll wait for R680+ then...  LOL..
*
R580 was not a huge change from R520. Same process, same architecture, they just tacked on 3 ALUs per pipe. But there will be no "R680" on the same process as R600, since it's about as big as it gets already. And i think i would have heard if there was going to be one. They're concentrating on their midrange chips this time because they will do well there, and they already have more orders than they can handle at this point. The next interesting one will be RV670...
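Just to put numbers on the "tacked on 3 ALUs per pipe" bit, a rough back-of-envelope sketch in Python (the pipe and ALU counts below are the commonly quoted R520/R580 figures, not anything stated in this thread):

# pixel-shader ALU count, R520 vs R580 (commonly quoted figures)
pipes = 16                   # both chips keep 16 texture/ROP "pipes"
r520_alus = pipes * 1        # R520: one pixel-shader ALU per pipe
r580_alus = pipes * 3        # R580: three pixel-shader ALUs per pipe
print(r520_alus, r580_alus)  # 16 vs 48 -> same process, just a lot more shader math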
banjarconverto
post May 8 2007, 08:37 AM

New Member
*
Junior Member
35 posts

Joined: Feb 2007
QUOTE
HD 2900XT is over my budget. Hoping HD 2900XL would offer good performance at 8800GTS 320MB price range.


XL? Are we still gonna have an XL moniker this gen? The last one was the X800XL I think, or was it the X1800XL, can't really remember.. hmm.gif hmm.gif
arjuna_mfna
post May 8 2007, 09:02 AM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



QUOTE(banjarconverto @ May 8 2007, 08:37 AM)
XL? Do we still gonna have XL moniker for this gen? The last one was X800XL I think, or was it X1800XL, can't really remember.. hmm.gif  hmm.gif
*
it was x1800xl
crasher
post May 8 2007, 12:50 PM

Getting Started
**
Junior Member
209 posts

Joined: Jan 2003
From: Far Far Away Land.



Wow... look what I found, a Sapphire HD2900XT exposed. This is probably the first custom design you will see from Sapphire. Just look at the heatpipes and cooler... really can't resist. drool.gif

derek87
post May 8 2007, 01:21 PM

Keep it C.L.E.A.N.
******
Senior Member
1,077 posts

Joined: Nov 2006
From: Sabah,Sandakan Status:STUNNED


QUOTE(crasher @ May 8 2007, 12:50 PM)
Wow...look what i found, a sapphire hd2900xt exposed. This is probably the first custom design you will see from sapphire. Just look at the heatpipes and cooler...really cannot tahan. drool.gif
*
it looks tempting to me... thumbup.gif
I've got a new slogan for ATI.

ATI

Makes You Wait

tongue.gif
arjuna_mfna
post May 8 2007, 02:32 PM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



QUOTE(crasher @ May 8 2007, 12:50 PM)
Wow...look what i found, a sapphire hd2900xt exposed. This is probably the first custom design you will see from sapphire. Just look at the heatpipes and cooler...really cannot tahan. drool.gif
*
damn, nice cooler... can't wait for it to enter the market...
arjuna_mfna
post May 8 2007, 04:37 PM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



see MSI HD2900XT here
http://www.fx57.net/?p=644
Zicron
post May 8 2007, 05:23 PM

Come to the Dark Side
*****
Senior Member
899 posts

Joined: Mar 2005
From: Klang


QUOTE(crasher @ May 8 2007, 12:50 PM)
Wow...look what i found, a sapphire hd2900xt exposed. This is probably the first custom design you will see from sapphire. Just look at the heatpipes and cooler...really cannot tahan. drool.gif
*
I guess they need that kind of cooler and heatpipes, as 82 degrees C on the outside of the cooler is what's been rumoured sweat.gif
HaHaNoCluE
post May 8 2007, 05:53 PM

Newbie
****
Senior Member
628 posts

Joined: Oct 2006


QUOTE(sleepy @ May 7 2007, 11:33 PM)
Hmm, 1799? So is the card available now from scanart? Or just the MSRP?
*
it's in Scanart's latest price list... GeCube X2900XT DDR3 512MB at RM1799... I thought it was DDR4... but nope... too bad..
banjarconverto
post May 8 2007, 06:10 PM

New Member
*
Junior Member
35 posts

Joined: Feb 2007
That ScanArt list is for dealers, right?? Maybe it will be higher for end customers sad.gif sad.gif
Chanwsan
post May 8 2007, 06:40 PM

सोहम
******
Senior Member
1,406 posts

Joined: Dec 2004
From: Living Hell


QUOTE(derek87 @ May 7 2007, 10:01 PM)
Wah... why is Ati so slow..? i asked viewnet about X2000series. They say have to wait til june. SO i think those midrange X2k series should be even longer? ggrrrr
*
You sure they said that? Well, I sure hope not; I was hoping they'd release it earlier so the price war and the DX9 card price slashing would start earlier... they keep on delaying and they're really killing their own sales..
salimbest83
post May 8 2007, 06:48 PM

♥PMS on certain day♥
*******
Senior Member
8,647 posts

Joined: Feb 2006
From: Jelutong Penang



QUOTE(Chanwsan @ May 8 2007, 06:40 PM)
u sure they said that? well i sure hope not, i was hoping that they release it earlier so the price war n dx9 GC price slash starts earlier... they keep on delay and they are reli killing their sales themselves..
*
yeah.. when they're at war.. we consumers will have a really, really good time

This post has been edited by salimbest83: May 8 2007, 06:49 PM
Joseph Hahn
post May 8 2007, 06:55 PM

kpop k
*******
Senior Member
6,410 posts

Joined: Jan 2003
From: MLK
QUOTE(almostthere @ May 7 2007, 09:11 PM)
for the LAST time, repeat after me, "R600 WILL BE 80nm", expect no less then that. And  I can bet 3 pieces of ikan bawal masak sambal it won't occur during R600's lifetime
*
Sorry, sergeant sharp-moustache... only 3 ikan bawal? :/ Anyhow, do you have any idea when the XTX version comes out? sweat.gif I won't mind if it uses 400W since I'm getting a 1KW PSU anyway.

almostthere
post May 8 2007, 07:08 PM

Kepala abah ko
Group Icon
VIP
3,773 posts

Joined: Jan 2003
From: Anywhere lah...as long got Kopi-O



QUOTE(jinaun @ May 7 2007, 10:18 PM)
probably it will face the same fate at R520... quickly replaced with R580

i'll wait for R680+ then...  LOL..
*
and by then R700 arrives, and you wait, and you wait, and you wait....

come on man, make up your mind
jinaun
post May 8 2007, 07:21 PM

where are my stars???
Group Icon
Elite
6,139 posts

Joined: Jan 2003
QUOTE(almostthere @ May 8 2007, 07:08 PM)
and by then R700 arrives, and you wait, and you wait, and you wait....

come on man, make up your mind
*
i'll only wait for that generation's cycle to mature before buying..

e.g. 7800 to 7900, or 1800 to 1900, etc etc...
arjuna_mfna
post May 8 2007, 09:08 PM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



Could this result be true?

source : http://r800.blogspot.com/2007/04/radeon-hd-2900xt_28.html



This post has been edited by arjuna_mfna: May 8 2007, 09:09 PM
banjarconverto
post May 9 2007, 12:42 AM

New Member
*
Junior Member
35 posts

Joined: Feb 2007
Seems there are more leaked benchmarks which indicate the HD2900XT is very competitive with, or maybe better than, the 8800 GTS. Even the DailyTech benchmarks hinted at this.

The blog entries are full of R600 stories. One thing I don't buy is an OCed 2900XT beating the 8800 Ultra in 3DMark06.
jinaun
post May 9 2007, 07:18 AM

where are my stars???
Group Icon
Elite
6,139 posts

Joined: Jan 2003
fudzilla mentioned that :
QUOTE
R650 is the Radeon HD 2950XTX

ATI hasn't launched the R600 yet and it is already planning its next launch of a die shrinked version. It won't be launching the 1024 MB graphics card until Q3. It plans to launch the R650 chip and brand it as Radeon HD 2950XTX, 1024 MB GDDR4.

The chip is called R650 and it is a 65 nanometre core, how convenient. With this new chip ATI might have a fighting chance against the Geforce 8800 Ultra, but the real question is what will Nvidia have at that time.

The cards are scheduled for late Q3, so you should expect them to arrive in August or September, unless they get postponed.


http://www.fudzilla.com/index.php?option=c...id=871&Itemid=1
arjuna_mfna
post May 9 2007, 09:38 AM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



14 May, just a few days to go... so far we only have info about the HD2900XT. Will they also introduce the lower-end versions of the HD2900 series - XL, PRO or GT, in short?
Joseph Hahn
post May 9 2007, 01:42 PM

kpop k
*******
Senior Member
6,410 posts

Joined: Jan 2003
From: MLK
Bah i think i'll just go for a cheapass X1950 Pro while waiting for 2900XTX. :/

This post has been edited by Joseph Hahn: May 9 2007, 03:31 PM
LExus65
post May 9 2007, 04:16 PM

Old Gezzer.....
******
Senior Member
1,995 posts

Joined: May 2005


just got this info from PC Zone: Asus X2900XT 512MB DDR3 @ RM1700++............... Sapphire no price yet but similar specs

it seems both are lagging a bit behind the 8800GTS
riku2replica
post May 9 2007, 04:49 PM

Mugi-chan!! 可愛い!!
*******
Senior Member
3,304 posts

Joined: Mar 2006
From: Chicago(Port25)
Hey, i want it! But too bad, my mobo is single slot PCI-e x16 only...
bata
post May 9 2007, 07:01 PM

Look at all my stars!!
*******
Senior Member
3,726 posts

Joined: Sep 2005
QUOTE(LExus65 @ May 9 2007, 04:16 PM)
just got this info from PC Zone, asus x2900XT 512mb DDR3 @ RM1700++............... Sapphire no price yet but similar specs

it seems both are loosing a bit behind 8800GTS
*
wuiyoo, so expensive...
they should price it around 1500++,
so it can be competitive with the 8800GTS 640MB


Chow.
arjuna_mfna
post May 9 2007, 09:18 PM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



QUOTE(riku2replica @ May 9 2007, 04:49 PM)
Hey, i want it! But too bad, my mobo is single slot PCI-e x16 only...
*
you only need 1 PCI-E x16 slot to run a VGA card; or do you want CrossFire? As long as your mobo has a PCI-E x16 slot it's compatible with current and upcoming VGA cards

just read a review about the HD2900XT, it loses to the 8800GTS
http://it-review.net/index.php?option=com_...d=1314&Itemid=1

This post has been edited by arjuna_mfna: May 9 2007, 09:53 PM
Nemesis181188
post May 9 2007, 11:02 PM

On my way
****
Senior Member
635 posts

Joined: Feb 2006
I was hoping it would be better than the GTS. Sad news, it seems.
LExus65
post May 9 2007, 11:17 PM

Old Gezzer.....
******
Senior Member
1,995 posts

Joined: May 2005


no detailed specs yet so it's hard to tell........ well, it depends on the market too i think..... Asus and some other brands of 8800GTS 640MB still cost around the 1700 mark..... so the local distributors are aiming at the same range.....

i was hoping for somewhere around 1200 instead, but haha...... no chance
blah1134
post May 10 2007, 05:50 AM

On my way
****
Senior Member
563 posts

Joined: Jan 2003


the 2900XL should cost around RM1200

wondering when we will get it here
sHawTY
post May 10 2007, 08:20 AM

Frequent Reporter
********
All Stars
14,909 posts

Joined: Jul 2005

Funny, ATI took a long time to launch their own DX10 cards, but the ATI X2900XT still loses to the 8800GTS 640MB. laugh.gif

ATI should have used that very long time to produce a more powerful card that can beat at least the 8800GTS 640MB, but from what I see, ATI is weak. shakehead.gif

Now, where's all the ruckus about ATI being in the same company as AMD? doh.gif

No need for me to change to ATI then. tongue.gif
arjuna_mfna
post May 10 2007, 08:20 AM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



QUOTE(blah1134 @ May 10 2007, 05:50 AM)
2900XL should cost around RM1200

wondering when will we get it here
*
it will enter the market in June, estimated price around USD279

QUOTE(sHawTY @ May 10 2007, 08:20 AM)
Funny, ATI took a long time to launch their own DX10 cards, but ATI X2900XT still lose to 8800GTS 640MB. laugh.gif

ATI should have use that very long time to produce a more powerful card that can beat at least 8800GTS 640MB, but from what i see, ATI is weak. shakehead.gif

Now, where's all the ruckus about ATI in the same company with AMD? doh.gif

No need for me to change to ATI then. tongue.gif
*
ya, me too... been waiting so long for ATI; if their latest product performs like that, maybe I'll change camps to nvidia after this. so disappointed

This post has been edited by arjuna_mfna: May 10 2007, 08:33 AM
exhauster
post May 10 2007, 02:41 PM

Casual
***
Junior Member
427 posts

Joined: Sep 2006
From: KK



QUOTE(sHawTY @ May 10 2007, 08:20 AM)
Funny, ATI took a long time to launch their own DX10 cards, but ATI X2900XT still lose to 8800GTS 640MB. laugh.gif

ATI should have use that very long time to produce a more powerful card that can beat at least 8800GTS 640MB, but from what i see, ATI is weak. shakehead.gif

Now, where's all the ruckus about ATI in the same company with AMD? doh.gif

No need for me to change to ATI then. tongue.gif
*
what you have said is not true; think twice about a world without ATI.
Nvidia could do whatever they want, for example: give us an 8800GTX with 8500 specs but priced at 2K, and we'd have no choice but to buy it.
But with ATI there to compete, Nvidia has to actually release powerful cards.
arjuna_mfna
post May 10 2007, 02:58 PM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



QUOTE(exhauster @ May 10 2007, 02:41 PM)
from wat u hv say is nt true, try think it twice if the earth wifoyut ATI
the Nvidia can do watever he want,example : giving a 8800GTX wif 8500 spec but price 2K, we hv no choice n we must buy it
but if there is ATI can compete so that Nvidia only can release a powerful card
*
then it's a waste of AMD's money to buy ATI and pay the R&D cost of making R600, if it's just to keep VGA prices competitive...
and one more thing: if AMD makes VGA cards just for that reason, better they just make midrange VGAs only, no need to spend R&D on a high end one, and no need to delay their VGA launch by 6 months

This post has been edited by arjuna_mfna: May 11 2007, 12:03 PM
Radeon
post May 11 2007, 10:46 AM

Semi-Retired Overclocker
*******
Senior Member
2,257 posts

Joined: Jan 2003

QUOTE(exhauster @ May 10 2007, 02:41 PM)
from wat u hv say is nt true, try think it twice if the earth wifoyut ATI
the Nvidia can do watever he want,example : giving a 8800GTX wif 8500 spec but price 2K, we hv no choice n we must buy it
but if there is ATI can compete so that Nvidia only can release a powerful card
*
chill, ati just lost this round, ati did beat nvidia on the x800 round though,

QUOTE(arjuna_mfna @ May 10 2007, 02:58 PM)
it wasting amd money, buy ati, R&D cost if just to make sure the price no to high...
and one morw, if amd produce vga card just for that perpose better they just make mid range vga only no need to to make R&D for high end one, and and no need to delay to 6months to launch their vga
*
i dun understand what you are saying


judging by the IT-Review.net numbers, the benchmarks seem peculiar.
a performance increase can be seen only in Company of Heroes, and by a HUGE margin; I've got no idea why.

This post has been edited by Radeon: May 11 2007, 11:02 AM
bata
post May 11 2007, 12:42 PM

Look at all my stars!!
*******
Senior Member
3,726 posts

Joined: Sep 2005
read the whole article
it's because of some driver issues


Chow.
empire23
post May 11 2007, 05:48 PM

Team Island Hopper
Group Icon
Staff
9,417 posts

Joined: Jan 2003
From: Bladin Point, Northern Territory

derek87
post May 11 2007, 10:00 PM

Keep it C.L.E.A.N.
******
Senior Member
1,077 posts

Joined: Nov 2006
From: Sabah,Sandakan Status:STUNNED


QUOTE(empire23 @ May 11 2007, 05:48 PM)

- The R600 is not even out with mature drivers (i have an X2300 on hand)
where did you get that X2300? show some benchies here pls... =) that will be so entertaining!
empire23
post May 11 2007, 10:09 PM

Team Island Hopper
Group Icon
Staff
9,417 posts

Joined: Jan 2003
From: Bladin Point, Northern Territory
QUOTE(derek87 @ May 11 2007, 10:00 PM)
where did you get that X2300? show so benchies here pls... =) that will be so entertaining!
*
Asus lappie. I might discreetly post it later. Just got the lappie yesterday lah.

But here's the weird part: the benches are heavily driver dependent (the driver reported to be in use is a Catalyst 8.31) and I get insanely low 3DMark scores, too low to believe. So I think there are a few bugs about, but it generally functions just nicely in some games. But it chokes on others. I'm going to try low level tests (MADD, triangle output and such) for the time being since they're far less driver dependent.

But AVIVO seems to work just fine, with hardware offloading across the board for the various media tested with it. And of course ATI's newly implemented PowerPlay and some fancy power gear thingy.
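On the low-level MADD test: whatever number you measure is usually sanity-checked against the theoretical peak, which is just ALU count x clock x 2 flops per MADD. A minimal Python sketch, using the widely quoted HD 2900 XT figures purely as the example (the X2300 above is a different, mobile DX9 part with no published shader specs, so don't read this as that chip):

# theoretical MADD peak: ALUs x clock (GHz) x 2 flops (one multiply + one add per issue)
def peak_madd_gflops(alus, clock_ghz):
    return alus * clock_ghz * 2

print(round(peak_madd_gflops(320, 0.742), 1))  # ~474.9 GFLOPS for a stock HD 2900 XT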

LittleLinnet
post May 11 2007, 10:52 PM

Iophobia
*******
Senior Member
3,593 posts

Joined: Feb 2005
From: ***Penang***
what does the X2300 have to do with R600?
I thought it's still based on the R5XX architecture and ATi just renamed it
linux11
post May 11 2007, 11:17 PM

Getting Started
**
Junior Member
104 posts

Joined: Jan 2005
From: Seremban


Mobility Radeon X2300
http://ati.amd.com/products/mobilityradeonx2300/specs.html

it's a DX9 gpu.
empire23
post May 12 2007, 12:10 AM

Team Island Hopper
Group Icon
Staff
9,417 posts

Joined: Jan 2003
From: Bladin Point, Northern Territory
QUOTE(LittleLinnet @ May 11 2007, 10:52 PM)
why does X2300 have something to do with R600?
I thought it is still based on R5XX architecture and ATi just renamed it
*
Yeap, it seems I'm mistaken.

Although it using highline drivers is weird (a prerelease beta, because the Catalyst installer said it was newer than 7.4). And ATI/AMD seems to be playing the "it might be this or that" game, with speculation saying it's sometimes an M64 or the next-gen M71. Seriously, at this point I can't tell, because the lappie has developed a keyboard fault and is to be sent back tomorrow (seems the 1-through-backspace and F1-F12 keys don't work)
SUSdattebayo
post May 12 2007, 02:27 AM

Look at all my stars!!
*******
Senior Member
5,366 posts

Joined: Aug 2005


R600 has a 512-bit bus, GDDR4, and more stream processors than the 8800GTX hmm.gif

http://en.wikipedia.org/wiki/Radeon_R600#Chipset_table

awaiting this, hope it will be capable of whipping up a whirlwind to blow buyers away from nvidia laugh.gif

if ATi still loses market share in the DX10 card market, will AMD be doomed unsure.gif
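The 512-bit bus is the one spec where the paper numbers clearly favour R600, so here's a quick back-of-envelope in Python. The 2900 XT memory clock is the 828 MHz stock figure quoted further down this thread; the 8800 GTX numbers are the commonly quoted ones, so treat this as a sketch rather than gospel:

# theoretical memory bandwidth = (bus width in bits / 8) x effective data rate (GT/s)
def bandwidth_gb_s(bus_bits, data_rate_gtps):
    return bus_bits / 8 * data_rate_gtps

print(round(bandwidth_gb_s(512, 1.656), 1))  # HD 2900 XT: 512-bit @ 828 MHz GDDR3 -> ~106.0 GB/s
print(round(bandwidth_gb_s(384, 1.8), 1))    # 8800 GTX:   384-bit @ 900 MHz GDDR3 -> ~86.4 GB/s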
cstkl1
post May 12 2007, 02:29 AM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(dattebayo @ May 12 2007, 02:27 AM)
R600 has 512 bit, GDDR4, more stream proc. than 8800GTX hmm.gif

http://en.wikipedia.org/wiki/Radeon_R600#Chipset_table

awaiting for this, hope it will be capable to create whirlwind to blow buyers away from nvidia laugh.gif

if ATi still lose their market share in the DX10 cards market, will AMD be doomed unsure.gif
*
cool card
just realised something
people gaming on Vista now have no choice but to go 4GB kekekke on Vista 64 if they want to even game with XTX CrossFire..

This post has been edited by cstkl1: May 12 2007, 02:40 AM
GeneralX
post May 12 2007, 01:46 PM

Newbie
****
Senior Member
577 posts

Joined: Jan 2007
From: Random


Found an interesting article ... the R700

Wikipedia R700

TSikanayam
post May 12 2007, 01:59 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(GeneralX @ May 12 2007, 12:46 AM)
Found an interesting article ... the R700

Wikipedia R700
*
Some things in the article don't make sense.

1. It references [T]ardOCP. Twice. That particular article they referenced has less credibility than even Inquirer's rumors. It was based on the fanboi opinion of an idiot who knows nuts about chip design.

2. If R700 was made for a Q1'08 launch, it would not inherit anything from Barcelona. It would be too far along in development to inherit much from AMD at this point, especially not something as fundamental as split power planes. If it does have something like that, it's because it was designed that way.

3. I personally doubt it would be 65nm, unless they go very conservative with it. My guess is 55nm (or smaller depending on when they are targeting to release).
gtoforce
post May 13 2007, 05:18 AM

SPAM AND BECOME A SENIOR MEMBER
*******
Senior Member
2,967 posts

Joined: May 2006



anything after September is still 65nm
haha
cuz the plan to shrink the graphics chips ain't gonna move till this Sept... I heard they are introducing a new chip size by that time

oh... and I think I'd rather wait for the 2950 Pro
TSikanayam
post May 13 2007, 07:50 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

R700 won't be around very soon (i highly doubt Q1'08, but then again... hehe).

2950 pro won't be here so soon either. Assuming it is the RV670. But yes that may turn out to be an interesting chip... as well as G92...

This post has been edited by ikanayam: May 13 2007, 07:51 AM
Radeon
post May 13 2007, 08:27 AM

Semi-Retired Overclocker
*******
Senior Member
2,257 posts

Joined: Jan 2003

more spoilers for the xtx oem
the card is shown as XT instead of XTX though, not sure if it's a driver issue.

Default XTX clock
» Click to show Spoiler - click again to hide... «


OC
» Click to show Spoiler - click again to hide... «


X-Fire
» Click to show Spoiler - click again to hide... «


Source:
http://www.xtremesystems.org/forums/showth...t=143915&page=3

This post has been edited by Radeon: May 13 2007, 08:27 AM
cstkl1
post May 13 2007, 09:30 AM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

err those scores are lower than a gtx...
first review by the man shamino.
http://www.vr-zone.com/?i=4946&s=1


Added on May 13, 2007, 11:12pm
a stock Ultra lost to an OCed, air-cooled X2900XT
http://r800.blogspot.com/2007/05/3dmark06-8800ultra-vs-2900xt.html

current world record of 3dmark05 - X2900xt!!
http://img46.imageshack.us/img46/4594/30kni1.jpg


Added on May 14, 2007, 3:13 am

cstkl1's score tongue.gif
x6800@3.317 , 473*7
EVGA 680 mobo,
2x1gb Crucial Ballistix Tracer - 950mhz CL4 - 2.2v
8800gtx 575/1350/900 (Stock)
3dmark06 = 11086.

Shamino VrZone
x6800@3.3 ,366*9
ASUS P5K Deluxe (Intel P35 Chipset)
2 x 1GB GSkill CL5DDR2 915MHz, 5:4 Divider
HD2900xt 880/1030 (742/828 - Stock)
3dmark06 = 11896!!!...


p.s. take note the 8800GTX was not running on the latest driver, which would have given it another 500 or so points, based on my other runs with the 8800GTS on that driver.
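For what it's worth, the gap between those two runs in plain percentages (simple arithmetic on the scores above; the +500 is just the driver estimate from the p.s.):

stock_gtx = 11086   # 8800 GTX at stock on the older driver
oc_2900xt = 11896   # HD 2900 XT overclocked to 880/1030
print(f"{(oc_2900xt - stock_gtx) / stock_gtx:.1%}")                  # ~7.3% ahead
print(f"{(oc_2900xt - (stock_gtx + 500)) / (stock_gtx + 500):.1%}")  # ~2.7% if the GTX really gains ~500 points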

This post has been edited by cstkl1: May 14 2007, 03:27 AM
jarofclay
post May 15 2007, 11:27 AM

Klipsch Addict
Group Icon
VIP
2,068 posts

Joined: Jan 2003
From: Ipoh / Penang / PJ


Just to share some benchies... the HD2900XT was running with very early drivers, so do keep that in mind.

Firingsquad
Hornet
post May 15 2007, 11:59 PM

What?
*******
Senior Member
4,251 posts

Joined: Jan 2003
From: Malacca, Malaysia, Earth


Hmm... judging from that DirectX 10 benchmark (Guru3D), I think the HD2900XT would have been kinda OK when it comes to DX10 - had it come out 6 months earlier, that is. And its power consumption isn't too nice either.

Anyway, more mature drivers would probably help it, performance wise.

This post has been edited by Hornet: May 16 2007, 12:01 AM
jarofclay
post May 16 2007, 04:58 PM

Klipsch Addict
Group Icon
VIP
2,068 posts

Joined: Jan 2003
From: Ipoh / Penang / PJ


Fishchicken: Need to ask you a little... is the lack of gaming performance currently on the R600 mainly attributable to drivers, and if so, do you foresee it beating the 8800GTX/Ultra once the R600 has mature drivers?

...coz if you ask me, the specs on the R600 are so forward looking.

Thanks.
TSikanayam
post May 16 2007, 05:08 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

Hm... it's quite "meh" to me. I think, at least gaming wise, NV got it right for the high end this time. Midrange may be a different story altogether.

There is too little information to judge right now. The drivers look quite lame at this point, so new drivers may help quite a bit, especially with the AA modes, since they seem to be doing the AA via shaders. But G80 drivers certainly aren't fully optimized either. I would not rush out and buy either one right now. DX10 games are still nonexistent. I would wait 3-4 months till the driver situation works itself out and maybe a few DX10 games are available before judging.
almostthere
post May 16 2007, 05:09 PM

Kepala abah ko
Group Icon
VIP
3,773 posts

Joined: Jan 2003
From: Anywhere lah...as long got Kopi-O



QUOTE(jarofclay @ May 16 2007, 04:58 PM)
Fishchicken: Need to ask you a little... is the lack of gaming performance currently on the R600 mainly attributed to drivers and if so, do you foresee it beating the 8800gtx/ultra with R600 having mature drivers on?

...coz if you ask me, the specs on the R600 is so forward looking.

Thanks.
*
that's what everyone in the tech scene is trying to figure out; for something that's this late to the game, it's got pretty mediocre results for now. I suspect drivers too, since it looks like R520 redux, so it's better to wait till someone has something good to report back
jinaun
post May 26 2007, 04:12 PM

where are my stars???
Group Icon
Elite
6,139 posts

Joined: Jan 2003
QUOTE
Radeon HD 2900 XT lacks UVD video acceleration
by Scott Wasson - 01:38 pm, May 25, 2007

I've just learned something that compels me to publish a major correction to our review of the Radeon HD 2900 XT GPU. I got the clear, distinct impression from AMD's presentations, statements, and conversations with me at its Radeon HD press event that its new UVD video decode acceleration logic was present in its entire lineup of Radeon HD graphics chips, and I relayed that information to you in our review of the product, promising to follow up with tests of this feature at a later date.

True to my word, I set out yesterday to test HD video decode acceleration on a Radeon HD 2900 XT using an HD DVD drive and a version of PowerDVD supplied by AMD for such purposes. To my surprise, CPU utilization during playback on our Core 2 Extreme X6800 test system ran between 40 and 50%, well above what one would expect from a solution with full HD video decode acceleration.

Naturally, I contacted AMD to inquire about the problem. I received a reply from AMD's David Baumann discussing the issue that ended with this revelation:

    Be aware, though, that HD 2900 XT itself does not feature UVD, this is specific to 2600 and 2400, so the levels of CPU utilization you see should be somewhat similar to the previous generation.

The UVD logic handles the CPU-intensive bitstream processing and entropy decode acceleration portions of the HD video playback pipeline. These are the most notable video decode acceleration capabilities that would separate the Radeon HD 2900 series from its direct competition, the GeForce 8800 series, if the HD 2900 XT actually had them. Turns out it does not. As the email states, the video playback capabilities of the Radeon HD 2900 XT are essentially similar to those of the previous-gen Radeon X1950.

So the essence of our correction is that the Radeon HD 2900 XT doesn't offer robust acceleration of HD video playback and will not likely reduce CPU utilization or power consumption substantially during high-definition video playback versus a GeForce 8800. We still intend to follow up with testing, but the lack of UVD logic on the GPU resets our expectations dramatically.

With that out of the way, I believe I ought to take a moment to explain how we came to believe the Radeon HD 2900 XT had full HD video playback acceleration, an impression formed by many layers of talk from AMD, starting with the Radeon HD name. Let me share a slide with you from a presentation titled "ATI Radeon HD 2000 Series and the Ultimate HD Experience," given by AMD's David Cummings, Director of Mobile GPU Marketing. The slide looks like so:

You can, of course, read for yourself that it says "Avivo HD technology makes full spec HD DVD / Blu-Ray (HD Disc) playback accessible at all price points," but I just like repeating it. That gives one a certain idea, does it not? Now, let's have a look at another slide showing what Avivo HD brings to video decode acceleration:

The bit labeled "Avivo HD" shows GPU acceleration of bitstream processing and entropy decode, and makes clear it's distinct from the Radeon X1000's Avivo video processing, which lacks acceleration of those stages.

Now, look at any specs list for the Radeon HD 2900 XT-say, this one from AMD's website, and you will find listed among its specs "ATI Avivo(tm) HD Video and Display Platform" and a bullet point under that saying "HD decode acceleration for H.264/AVC, VC-1, DivX and MPEG-2 video formats." At the end of the day, one gets the impression that this GPU has Avivo HD, with all that entails.

Of course, AMD has left itself some wiggle room in its technical statements. The specs list above isn't technically untrue-just imprecise. The dodge built into the Cummings presentation, with its talk of making HD video playback "accessible at all price points" seems to be that high-end CPUs can handle HD video playback without as much assistance from the GPU. But that's a paper thin excuse, in my view.

To make sure this wasn't simply a matter of me missing the boat-it has been known to happen, and I've got a few gray hairs promising more of the same in the future-I checked with a couple of other journalists who attended a separate Radeon HD press event the week after the one I attended. Both Marco Chiappetta from HotHardware and Ryan Shrout of PC Perspective came away from their meetings with AMD convinced the Radeon HD 2900 XT had full HD playback acceleration via UVD logic, as well. I was not alone in gathering this impression from AMD. To their credit, some reviewers did sort through the fog and identify the Radeon HD 2900 XT's lack of UVD, but they were swimming against the tide of statements from AMD itself.

Nor could any of us have uncovered this fact prior to the publication of our reviews via testing, because AMD hasn't yet delivered a driver that includes the support for the Radeon HD's "full" multimedia capabilities. They initially targeted May 9 for that driver's release. AMD now says the driver is due next week.


http://www.techreport.com/onearticle.x/12552

will it be an issue or not...?

This post has been edited by jinaun: May 26 2007, 04:13 PM
X.E.D
post May 27 2007, 12:32 PM

curmudgeonosorus emeritus
******
Senior Member
1,955 posts

Joined: Jan 2006
From: Llanfair­pwllgwyngyll­gogery­ch


QUOTE(jinaun @ May 26 2007, 05:12 PM)
http://www.techreport.com/onearticle.x/12552

will it be an issue or not...?
*
Not really.
Did that author get paid off by nVidia? He seems to make a HUGE deal out of losing entropy decoding/bitstream processing. The 8800 doesn't have them either, which practically keeps the two on an even playing field for now.

Marketing BS, sure, but it doesn't even compare to the magnitude of what Creative has done with their Audigy and X-Fi cards. Those are much easier to build a lawsuit on than this.

The important factor will be the redux flagship chips (HD2950XT+/8900+), for which demanding full AVIVO HD/PureVideo HD v2 would be reasonable.

More importantly, hi-def video is more prevalent in H.264/x264 files than on HD-DVD/BD, and there's a very nutty codec called CoreAVC that high-end GC users could buy without even blinking. 1080p H.264 fulfilled!

This post has been edited by X.E.D: May 27 2007, 12:32 PM
t3chn0m4nc3r
post May 27 2007, 12:41 PM

Teh Necron Lord
*******
Senior Member
4,139 posts

Joined: Sep 2006
From: Internet


most probably not.... just wait and see, or AMD will face great pressure...
ruffstuff
post May 27 2007, 04:58 PM

Look at all my stars!!
*******
Senior Member
3,345 posts

Joined: Jan 2003
QUOTE(X.E.D @ May 27 2007, 12:32 PM)
Not really.
Did that author get paid off by nVidia? Seems to make a HUGE deal out of losing entropy decoding/bitstream processing. 8800 doesn't have'em too, practically keeping them at an even playing level for now.

Marketing BS, but this wouldn't even compare to the magnitude that Creative has done with their Audigy and X-Fi cards. Those are much easier to get a lawsuit on than this one.

The important factor will be the redux flagship chips (HD2950XT+/8900+), in which demanding full AVIVO HD/Purevideo HD v2 would be reasonable.

More importantly, hi-def video is more prevalent on H/X264 files than HD-DVD/BD, and there's a very nut-zy codec called CoreAVC that high-end GC users could buy without even blinking their eyes. 1080p H.264 fulfilled!
*
At least NVIDIA made a less vague statement compared to AMD. On nvidia's site they mention that all Geforce 8 series cards have Purevideo HD technology, with a small notation below saying that only the 8500 and 8600 are supported. Still confusing, but at least the press were aware of the missing HD decoder on the G80 cards even before the mainstream Geforce 8 series launched. No broken promises there.
Makes me think that launching both R600 and RV6XX at the same time messed things up a little bit.
blindbox
post May 28 2007, 12:10 AM

Meh
******
Senior Member
1,705 posts

Joined: Nov 2004


QUOTE(X.E.D @ May 27 2007, 12:32 PM)
Not really.
Did that author get paid off by nVidia? Seems to make a HUGE deal out of losing entropy decoding/bitstream processing. 8800 doesn't have'em too, practically keeping them at an even playing level for now.

Marketing BS, but this wouldn't even compare to the magnitude that Creative has done with their Audigy and X-Fi cards. Those are much easier to get a lawsuit on than this one.

The important factor will be the redux flagship chips (HD2950XT+/8900+), in which demanding full AVIVO HD/Purevideo HD v2 would be reasonable.

More importantly, hi-def video is more prevalent on H/X264 files than HD-DVD/BD, and there's a very nut-zy codec called CoreAVC that high-end GC users could buy without even blinking their eyes. 1080p H.264 fulfilled!
*
Uh, I have CoreAVC codec right here, included with k-lite mega codec pack, hmm. I wonder..... was my way teh *ahem* way? lol.

This post has been edited by blindbox: May 28 2007, 12:14 AM
jinaun
post Dec 16 2007, 03:55 PM

where are my stars???
Group Icon
Elite
6,139 posts

Joined: Jan 2003
leaked R680 proto boards???



source: http://en.expreview.com/?p=115


Attached thumbnail(s)
Attached Image Attached Image
ben_panced
post Dec 16 2007, 04:09 PM

PC and MotorBicycle Enthusiast
*****
Senior Member
962 posts

Joined: Dec 2004
From: Kulai


where are the RAM chips..
can't see any rclxub.gif
jinaun
post Dec 16 2007, 04:31 PM

where are my stars???
Group Icon
Elite
6,139 posts

Joined: Jan 2003
QUOTE(ben_panced @ Dec 16 2007, 04:09 PM)
where's the ram chip..
cant see any  rclxub.gif
*
it's embedded into the GPU with a total bus width of 2560 bits (composed of three independent buses: 1024-bit write, 1024-bit read, 512-bit read/write) working at 8 GHz

» Click to show Spoiler - click again to hide... «


This post has been edited by jinaun: Dec 16 2007, 04:40 PM
clayclws
post Dec 16 2007, 04:50 PM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


QUOTE(jinaun @ Dec 16 2007, 04:31 PM)
its embedded into the GPU with total bus width of 2560-bit (composed of three independent buses: 1024-bit write, 1024-bit read, 512-bit read/write) working at 8 GHz
*
Haha~! If only that were true... Guess it would take a 12nm process or smaller for that to happen. Now that would be cool wink.gif
[attachmentid=360983][attachmentid=360985]
The blue squares are where the GDDR3/4 chips are supposed to be.

This post has been edited by clayclws: Dec 16 2007, 05:00 PM
clayclws
post Dec 17 2007, 07:40 PM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


I wonder if it's worth the wait...January ain't that far...but 31st January is still January...
[attachmentid=361794]
[attachmentid=361795]
[attachmentid=361796]
smokey
post Dec 17 2007, 08:13 PM

Infinity speed
*******
Senior Member
3,506 posts

Joined: Jan 2003
From: Lumpur
dunno which date in January... haiz... if we knew, then we could prepare for cheaper HD3850s and HD3870s...
clayclws
post Dec 17 2007, 08:16 PM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


I'm worried about the GDDR3 usage... it's not as energy efficient or OC-able as GDDR4. They've used GDDR4 before... why are they taking a step back for their flagship product? Hope it's not accurately reported.

This post has been edited by clayclws: Dec 17 2007, 08:16 PM
jinaun
post Dec 17 2007, 11:09 PM

where are my stars???
Group Icon
Elite
6,139 posts

Joined: Jan 2003
QUOTE(clayclws @ Dec 17 2007, 08:16 PM)
I'm worried about the GDDR3 usage...it's not as energy efficient or OC-able as GDDR4. They've used GDDR4...why are they taking a step back for their flagship product? Hope it's not accurately reported.
*
perhaps GDDR3 is much cheaper.. and clock for clock it's faster than GDDR4, due to the additional latencies associated with GDDR4

GDDR4 will only be faster than GDDR3 if it's clocked high enough..

e.g. take DDR400 vs DDR2-400
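Putting that DDR400 vs DDR2-400 analogy into numbers (the CAS values below are typical retail ones, picked purely for illustration, not from this thread):

# absolute CAS latency in ns = CAS cycles / bus clock (MHz) x 1000
def cas_latency_ns(cas_cycles, bus_clock_mhz):
    return cas_cycles / bus_clock_mhz * 1000

print(cas_latency_ns(2.5, 200))  # DDR400   @ CL2.5 -> 12.5 ns
print(cas_latency_ns(4.0, 200))  # DDR2-400 @ CL4   -> 20.0 ns
# same 400 MT/s data rate, but the newer tech pays more cycles, so clock for clock
# it's actually slower; it only pulls ahead once it's clocked well past the old part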
arjuna_mfna
post Dec 18 2007, 02:25 AM

**Towards Justice World**
******
Senior Member
1,496 posts

Joined: Jan 2006
From: Baling, Kedah



QUOTE(ben_panced @ Dec 16 2007, 04:09 PM)
where's the ram chip..
cant see any  rclxub.gif
*
this photo is of a bare board sample (with no RAM chips); it's an unfinished product...
clayclws
post Dec 18 2007, 10:39 AM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


QUOTE(jinaun @ Dec 17 2007, 11:09 PM)
perhaps GDDR3 is much cheaper.. and clock for clock.. is faster than GDDR4 due to additional latencies associated with GDDR4

GDDR4 will onli be faster than GDDR3 if its clocked high enough..

eg.. take DDR400 vs DDR2-400
*
Hmm... I have never seen such a thing happen in the graphics RAM department before. I always had the impression that GDDR4 is a better-specced GDDR3 - an improvement in energy efficiency, increased official speed, etc. In other words, it is like the Vios 2007 compared to the ordinary Vios. I guess I'd better go dig up more from my friends and the web.
skylinegtr34rule4life
post Dec 18 2007, 12:52 PM

13k elite :P
********
Senior Member
13,340 posts

Joined: Feb 2005
From: back from vacation XD



QUOTE(smokey @ Dec 17 2007, 08:13 PM)
dunno january which date...haiz...if we know, then we can prepare for cheaper hd3850 and hd3870...
*
900 bucks is considered so cheap already la, compared with 1500 bucks for the crappy 2900XT laugh.gif icon_rolleyes.gif
TSikanayam
post Dec 18 2007, 01:01 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(clayclws @ Dec 17 2007, 09:39 PM)
Hmm...I have never seen such thing happening in the Graphics RAM department before. I always have the impression that GDDR4 is a better spec-ed GDDR3 - an improvement in energy efficiency, increased official speed, etc. In other words, it is like Vios 2007 compared to the ordinary Vios. I guess I better go dig up more from my friends and web.
*
Looks like you probably didn't read what you linked to. GDDR4 saves power and clocks higher not by magic, the internal memory elements are clocked at half the speed of comparable GDDR3, so latency is higher.
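A quick sketch of the "half the speed" point: GDDR4 doubles the prefetch over GDDR3 (8n vs 4n - the standard spec figures, not from this thread), so for the same data rate the internal array only needs to run at half the clock, which is where both the power saving and the extra latency come from:

# data rate (MT/s) is roughly prefetch x internal array clock (MHz)
data_rate = 1656      # e.g. 828 MHz GDDR3 on a stock 2900 XT, double data rate
print(data_rate / 4)  # GDDR3, 4n prefetch -> array at ~414 MHz
print(data_rate / 8)  # GDDR4 at the same data rate, 8n prefetch -> ~207 MHz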

This post has been edited by ikanayam: Dec 18 2007, 01:04 PM
sonic_cd
post Dec 18 2007, 02:25 PM

Friendship Is Magic
********
All Stars
19,042 posts

Joined: Jan 2003
From: Soleanna

QUOTE(skylinegtr34rule4life @ Dec 18 2007, 12:52 PM)
900 bucks is consider so cheap ardy la compare with 1500 bucks 4 crappy 2900XT laugh.gif  icon_rolleyes.gif
*
and it's not so power hungry. lol
clayclws
post Dec 19 2007, 04:38 PM

Look at all my stars!!
*******
Senior Member
2,659 posts

Joined: Sep 2006
From: Miri, PJ & KL


QUOTE(ikanayam @ Dec 18 2007, 01:01 PM)
Looks like you probably didn't read what you linked to. GDDR4 saves power and clocks higher not by magic, the internal memory elements are clocked at half the speed of comparable GDDR3, so latency is higher.
*
My bad. Bad reading comprehension. Was looking for the advantage bits only.

This post has been edited by clayclws: Dec 19 2007, 06:45 PM

 
