 Nvidia Cheats~

TSrsangel
post Sep 5 2007, 10:54 AM

No Smoking pls @_@
******
Senior Member
1,487 posts

Joined: Dec 2006
From: Ja Bee


http://www.dailytech.com/AMD+Alleges+NVIDI...article8608.htm

What do you guys think about this news from DailyTech? Trustable?
sniper69
post Sep 5 2007, 12:35 PM

.: One Shot One Kill :. .+|Level 9 Type Shit|+.
*******
Senior Member
7,173 posts

Joined: Jan 2003
From: PCH


Alah... easy one la... nVIDIA's image quality (IQ) is not good, ATi's IQ is way better, no, seriously... icon_idea.gif You guys should know this already...
SlayerXT
post Sep 5 2007, 12:39 PM

PRIDE!
*******
Senior Member
2,042 posts

Joined: Jan 2003
From: KL



That was for video encoding, right? I don't really care, I'm always into gaming...
sniper69
post Sep 5 2007, 12:42 PM

.: One Shot One Kill :. .+|Level 9 Type Shit|+.
*******
Senior Member
7,173 posts

Joined: Jan 2003
From: PCH


QUOTE(§layerXT @ Sep 5 2007, 12:39 PM)
That was for video encoding, right? I don't really care, I'm always into gaming...
*
Yeap, indeed...
but in both (gaming and video rendering), ATi has always been better. I don't know why, I don't know the reason... but it's been that way since the Radeon 8 series tongue.gif
ff7yta
post Sep 5 2007, 12:55 PM

Rookie
******
Senior Member
1,091 posts

Joined: Oct 2005


Could be that Nvidia releases drivers that draw faster, while ATI goes for drivers that draw more accurately.

If so, then Nvidia will always score higher in benchmarks, with a graphics quality reduction that isn't too visible.

QUOTE
"As we openly told reviewers, using aggressive noise reduction settings may cause ghosting depending on the content played so we recommend using moderate settings. We also recommend the improved 163.44 drivers release a few weeks ago which reduce this effect," Allen said.

Maybe this shows what Nvidia was really up to with its improved drivers.
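For illustration, here is a toy sketch of why aggressive temporal noise reduction causes the ghosting the quote mentions; this is entirely my own example, nothing to do with the actual driver code: the more of the previous frames you blend into the current one, the more residue gets left behind moving objects.

CODE
import numpy as np

def temporal_nr(frames, strength):
    """Naive temporal noise reduction: blend each frame with a running
    average of the previous ones. `strength` is the weight given to
    history (0 = off, close to 1 = very aggressive)."""
    history = frames[0].astype(np.float64)
    out = [history.copy()]
    for frame in frames[1:]:
        history = strength * history + (1.0 - strength) * frame
        out.append(history.copy())
    return out

# A bright "object" moving one pixel per frame across a dark row.
frames = [np.eye(8)[pos] for pos in range(5)]

for strength in (0.2, 0.8):  # moderate vs aggressive setting
    last = temporal_nr(frames, strength)[-1]
    # With strength=0.8, pixels the object has already left still hold
    # visible residue: that is the "ghosting" described above.
    print(f"strength={strength}:", np.round(last, 2))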

This post has been edited by ff7yta: Sep 5 2007, 12:56 PM
cks2k2
post Sep 5 2007, 04:01 PM

...
******
Senior Member
1,966 posts

Joined: Jan 2003
From: No longer hanging by a NUS

ATI -> Quack.exe
NV -> 3DMark

The pot calling the kettle black.
Renovatio
post Sep 5 2007, 04:55 PM

~ Enthusiast low on cash ~
******
Senior Member
1,942 posts

Joined: Nov 2005
From: Penang


Sounds like a bug on Nvidia's side, and their latest driver is just trying to mask the problem. But then again, not that I would even care, since I only use the lowly 6600GT smile.gif
HaHaNoCluE
post Sep 5 2007, 05:16 PM

Newbie
****
Senior Member
628 posts

Joined: Oct 2006


But recently the 2900XT scores very well in 3DMark yet doesn't perform very well in games...
obefiend
post Sep 5 2007, 06:21 PM

Selamat Hari Raya Aidilluminati
*****
Senior Member
863 posts

Joined: Mar 2007
From: Tanjung Segitiga Masonic Lodge



QUOTE(HaHaNoCluE @ Sep 5 2007, 05:16 PM)
But recently the 2900XT scores very well in 3DMark yet doesn't perform very well in games...
*
I noticed this too. The 2900 XT kicks the 8800GTX's ass there, but when playing BioShock the ATi card lags behind by at least 10 FPS.
Eoma
post Sep 5 2007, 08:20 PM

- ,. -
Group Icon
Elite
4,603 posts

Joined: Jan 2003
From: PJ


QUOTE(sniper69 @ Sep 5 2007, 12:42 PM)
yeap indeed...
but in both (gaming & video rendering), ATi's always better, i don't know why, i don't know the reason... but it was there since Radeon 8 series tongue.gif
*
It used to be that way, back when the AF quality of Nvidia cards was inferior to ATI's. That all changed with the 7-series, if I am not mistaken. In terms of 3D rendering, they're both pretty much equal now.
Breaktru
post Sep 5 2007, 08:44 PM

== The World ==
******
Senior Member
1,769 posts

Joined: Jan 2003
From: Malaysia


QUOTE(Eoma @ Sep 5 2007, 08:20 PM)
It used to be that way, back when the AF quality of Nvidia cards was inferior to ATI's. That all changed with the 7-series, if I am not mistaken. In terms of 3D rendering, they're both pretty much equal now.
*
The AF quality of NV and ATi only became equal with the 8-series. A lot of people complained about the 7-series' quality, which was even worse than the 6-series'.
HaHaNoCluE
post Sep 5 2007, 08:51 PM

Newbie
****
Senior Member
628 posts

Joined: Oct 2006


Tried the 2600XT GDDR4; it's complete suicide gaming with AA + AF on... very, very low FPS, and the image quality is really annoying... some 2900XT users reported the same issues with their GPU... whenever AA + AF is on, the card is dying... (I was impressed with their 3DMark score but gave up on the idea of changing to a 2900XT after some comments on gaming)... I don't want to play games without AA + AF on...
Eoma
post Sep 5 2007, 10:52 PM

- ,. -
Group Icon
Elite
4,603 posts

Joined: Jan 2003
From: PJ


QUOTE(Breaktru @ Sep 5 2007, 08:44 PM)
The AF quality of NV and ATi only became equal with the 8-series. A lot of people complained about the 7-series' quality, which was even worse than the 6-series'.
*
Ahh cool. Thanks for the correction. smile.gif
ikanayam
post Sep 5 2007, 11:48 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(Breaktru @ Sep 5 2007, 07:44 AM)
The AF quality of NV and ATi only became equal with the 8-series. A lot of people complained about the 7-series' quality, which was even worse than the 6-series'.
*
In fact, the 8 series AF quality is better than any ATI card now. NV did a really good job there.
AoiNatsume
post Sep 5 2007, 11:53 PM

Certified Noob
******
Senior Member
1,051 posts

Joined: Feb 2005
From: Somewhere Out There



Yeah, they did a good job there, and it's about the ONLY thing where an Nvidia card kicks the 2900XT's ass. Other than the AA/AF issue, judging from the other factors I don't see why the 2900XT should be left behind at all. And the difference between the 2900XT and the 8800GTX is HUGE.
ikanayam
post Sep 6 2007, 12:05 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(AoiNatsume @ Sep 5 2007, 10:53 AM)
Yeah, they did a good job there, and it's about the ONLY thing where an Nvidia card kicks the 2900XT's ass. Other than the AA/AF issue, judging from the other factors I don't see why the 2900XT should be left behind at all. And the difference between the 2900XT and the 8800GTX is HUGE.
*
Well, as far as I can see, NV did such a good job and/or ATI did such a bad one that ATI had to price its top-end card one step lower than NV's top-end card. And that says a lot. Clearly this was NOT what they had in mind for the R600. The G80 seems to be the better chip in many ways. Why shouldn't the 2900XT be left behind? Because its paper specs look impressive?
The Scent LYN
post Sep 6 2007, 01:42 AM

Casual
***
Junior Member
387 posts

Joined: May 2007
From: Pahang,Sentul,MMU Malacca




I'm not sure whether this is relevant to this topic or not.

http://enthusiast.hardocp.com/article.html...W50aHVzaWFzdA==
X.E.D
post Sep 6 2007, 05:05 PM

curmudgeonosorus emeritus
******
Senior Member
1,955 posts

Joined: Jan 2006
From: Llanfair­pwllgwyngyll­gogery­ch


QUOTE(The Scent LYN @ Sep 6 2007, 01:42 AM)
I'm not sure whether this is relevant to this topic or not.

http://enthusiast.hardocp.com/article.html...W50aHVzaWFzdA==
*
BioShock is an Nvidia "The Way It's Meant To Be Played" optimized game.

Wait for Cat 7.9 or 8.0 when ATI can get some real optimizations.


Oh, and [T]ardOCP is heavily nvidia-slanted, just see the forums and you'll know.
Bioshock should be faster on the HD2900XT than the 8800ULTRA in DX9.
SlayerXT
post Sep 6 2007, 09:15 PM

PRIDE!
*******
Senior Member
2,042 posts

Joined: Jan 2003
From: KL



Any card is meant to be played as long as it can provide more than 30fps continuously.
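To put a number on that: "more than 30fps continuously" means every single frame has to finish in under roughly 33 ms, not just the average. A quick sketch with made-up frame times:

CODE
# "More than 30 fps continuously" means every frame must finish in under
# 1000/30 ms, not just the average frame.
BUDGET_MS = 1000.0 / 30.0  # about 33.3 ms per frame

# Hypothetical frame times (in milliseconds) captured during gameplay.
frame_times_ms = [28.0, 30.5, 31.2, 41.7, 29.9, 33.0]

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
worst_fps = 1000.0 / max(frame_times_ms)
print(f"average fps: {avg_fps:.1f}, worst-case fps: {worst_fps:.1f}")
print("continuous 30 fps?", all(t < BUDGET_MS for t in frame_times_ms))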
X.E.D
post Sep 6 2007, 10:25 PM

curmudgeonosorus emeritus
******
Senior Member
1,955 posts

Joined: Jan 2006
From: Llanfair­pwllgwyngyll­gogery­ch


QUOTE(§layerXT @ Sep 6 2007, 09:15 PM)
Any card is meant to be played as long as it can provide more than 30fps continuously.
*
No, nVidia gave 2K Boston/Australia money and tech and marketing so that they'd optimize DX10 for nVidia GPUs.
DX10 architecture on both cards couldn't be more different.

On DX9, the raw speed path, 2900XT kills the 8800 Ultra. No, not a bug.

This is not the first time the 2900XT has come out ahead on Unreal Engine 3, so obviously someone is up to something.

This post has been edited by X.E.D: Sep 6 2007, 10:27 PM
ikanayam
post Sep 6 2007, 10:34 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(X.E.D @ Sep 6 2007, 09:25 AM)
No, nVidia gave 2K Boston/Australia money and tech and marketing so that they'd optimize DX10 for nVidia GPUs.
DX10 architecture on both cards couldn't be more different.

On DX9, the raw speed path, 2900XT kills the 8800 Ultra. No, not a bug.

This is not the first time the 2900XT has come out ahead on Unreal Engine 3, so obviously someone is up to something.
*
Nvidia also had their DX10 card ready long before ATI. This does make a difference. If the situation were reversed, you'd probably see the opposite happening. It's quite difficult to optimize for something that does not exist. The G80 was probably used as the main reference point for DX10 optimization in most DX10 engines being developed at the time.
HaHaNoCluE
post Sep 7 2007, 12:51 PM

Newbie
****
Senior Member
628 posts

Joined: Oct 2006


QUOTE(X.E.D @ Sep 6 2007, 05:05 PM)
BioShock is an Nvidia "The Way It's Meant To Be Played" optimized game.

Wait for Cat 7.9 or 8.0 when ATI can get some real optimizations.
Oh, and [T]ardOCP is heavily nvidia-slanted, just see the forums and you'll know.
Bioshock should be faster on the HD2900XT than the 8800ULTRA in DX9.
*
Fanboy detected... biggrin.gif ... Nah, I don't care, I'll just buy the best bang for the buck, be it ATI, Nvidia or even S3!!! If it doesn't deliver, then forget about it... I don't want to wait 6 months for a great driver, because in another few months you'll probably hear about another new GPU on the market again... icon_rolleyes.gif
Games For Windows
post Sep 7 2007, 03:22 PM

Getting Started
**
Junior Member
80 posts

Joined: Jul 2007
Nvidia always had a knack for 'optimising'... which is great. I mean, after all, if they do something to a particular test/benchmark WITHOUT affecting the visuals and/or quality, why not? I remember the 3DMark cheat thingy... I was like, "as if anyone actually cares whether the apple is really full or not, because all you see is an entire fruit". Nvidia 'cheats'? That is definitely not a nice thing to say...
tracyjz
post Sep 7 2007, 04:45 PM

-Jay Jay-
******
Senior Member
1,059 posts

Joined: Jan 2007
From: Petaling jaya
How do you cheat with hardware? o.O
sting79
post Sep 7 2007, 06:30 PM

Getting Started
**
Junior Member
123 posts

Joined: Nov 2006



QUOTE(HaHaNoCluE @ Sep 7 2007, 12:51 PM)
Fanboy detected... biggrin.gif ... Nah, I don't care, I'll just buy the best bang for the buck, be it ATI, Nvidia or even S3!!! If it doesn't deliver, then forget about it... I don't want to wait 6 months for a great driver, because in another few months you'll probably hear about another new GPU on the market again... icon_rolleyes.gif
*
Yeah, +1 to that. I almost wanted to get the 2900XT when it came out in May, but the bad early impressions deterred me from doing so. That's one less customer for ATI tongue.gif

Not saying that driver improvements over time are bad, but I believe the disappointing initial launch will somehow affect sales... this isn't a PC game; when I want to get a graphics card, I don't wait for new drivers to improve the card's performance (can I even be sure they will?), I get what's good at the time... whereas for a game, I could always wait a while for new patches to make it better smile.gif

Just my humble 2 cents.
X.E.D
post Sep 7 2007, 08:29 PM

curmudgeonosorus emeritus
******
Senior Member
1,955 posts

Joined: Jan 2006
From: Llanfair­pwllgwyngyll­gogery­ch


QUOTE(HaHaNoCluE @ Sep 7 2007, 12:51 PM)
Fanboy detected... biggrin.gif ... Nah, I don't care, I'll just buy the best bang for the buck, be it ATI, Nvidia or even S3!!! If it doesn't deliver, then forget about it... I don't want to wait 6 months for a great driver, because in another few months you'll probably hear about another new GPU on the market again... icon_rolleyes.gif
*
I think what you said suits you a lot more.
The 320MB is a piece of junk, failing to run even Call of Juarez in DX10 smoothly.
It delivers- yeah right. rolleyes.gif
Fanboy detected. laugh.gif



Now, back on semi on-topic.

BioShock is an nVidia SPONSORED game, not optimized by ATi drivers or anything.
It's just the same case as Lost Planet, game devs bought over to code deliberately FOR their cards (and in this case because both cards are very different, AGAINST ATi ones) in DX10. It's not cheating, but it's obviously anticompetitive and all. Futuremark's 3DM08 should be something better to compare to- it's not a game, but it's not using DX10 half-assedly or heavily supported by either side.

Truth is, look at any benchmark of Unreal Engine 3 in DX9 and it's obvious that the card you *didn't* buy wins. I took the long view, you did the current math. UE3 and other shader-heavy titles are the major cornerstone of future games, DX9 or 10. Oh, and enjoy your slower card running at 90 degrees under load. smile.gif

QUOTE(Games For Windows @ Sep 7 2007, 03:22 PM)
Nvidia always had a knack for 'optimising'... which is great. I mean, after all, if they do something to a particular test/benchmark WITHOUT affecting the visuals and/or quality, why not? I remember the 3DMark cheat thingy... I was like, "as if anyone actually cares whether the apple is really full or not, because all you see is an entire fruit". Nvidia 'cheats'? That is definitely not a nice thing to say...
*
They cheated 3DMark03 with FP16 precision replacement, IIRC. That does affect quality; otherwise everyone would be using FP16 instead of the DX9 FP24 standard, right?
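A rough way to see why dropping precision isn't free (just a general illustration of precision loss, not the actual shader replacement): FP16 keeps a 10-bit mantissa while the DX9 FP24 minimum keeps 16 bits, so FP16 rounds values far more coarsely and overflows much earlier. NumPy has no 24-bit float, so FP32 stands in for the higher-precision side here, and the sample values are made up.

CODE
import numpy as np

# FP16 keeps a 10-bit mantissa; DX9's FP24 minimum keeps 16 bits.
# NumPy has no float24, so float32 stands in for the higher-precision side.
values = np.array([1.0001, 3.14159265, 1000.5, 65519.0], dtype=np.float64)

fp16 = values.astype(np.float16).astype(np.float64)
fp32 = values.astype(np.float32).astype(np.float64)

for v, lo, hi in zip(values, fp16, fp32):
    print(f"value={v:<12} as fp16={lo:<12} as fp32={hi:<12} "
          f"fp16 relative error={abs(lo - v) / v:.2e}")

# FP16 also overflows past 65504, where FP24/FP32 still have headroom.
print(np.float16(70000.0))  # prints inf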



This post has been edited by X.E.D: Sep 7 2007, 08:32 PM
SlayerXT
post Sep 7 2007, 08:30 PM

PRIDE!
*******
Senior Member
2,042 posts

Joined: Jan 2003
From: KL



QUOTE(tracyjz @ Sep 7 2007, 04:45 PM)
How do you cheat with hardware? o.O
*
Usually they just use software to cover for hardware weaknesses.
storm88
post Sep 7 2007, 10:51 PM

~UncleSam Ready to Rolls~
*******
Senior Member
5,595 posts

Joined: Jan 2003
From: Between Hell and Heaven
Erm... you guys know about OpenGL games, right?
BioShock, Crysis... I heard they are "OpenGL"-rendered games...

Get what I mean?
ikanayam
post Sep 7 2007, 11:11 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(X.E.D @ Sep 7 2007, 07:29 AM)
I think what you said suits you a lot more.
The 320MB is a piece of junk, failing to run even Call of Juarez in DX10 smoothly.
It delivers- yeah right.  rolleyes.gif
Fanboy detected.  laugh.gif



Now, back on semi on-topic.

BioShock is an nVidia SPONSORED game, not optimized by ATi drivers or anything.
It's just the same case as Lost Planet, game devs bought over to code deliberately FOR their cards (and in this case because both cards are very different, AGAINST ATi ones) in DX10. It's not cheating, but it's obviously anticompetitive and all. Futuremark's 3DM08 should be something better to compare to- it's not a game, but it's not using DX10 half-assedly or heavily supported by either side.

Truth is, look at any benchmark of Unreal Engine 3 in DX9 and it's obvious that the card you *didn't* buy wins. I took the long view, you did the current math. UE3 and other shader-heavy titles are the major cornerstone of future games, DX9 or 10. Oh, and enjoy your slower card running at 90 degrees under load. smile.gif
They cheated 3DMark03 with FP16 precision replacement, IIRC. That does affect quality; otherwise everyone would be using FP16 instead of the DX9 FP24 standard, right?
*
These days I think many games/benchmarks are being "optimized" by both companies. In the past, what they're doing now would have been considered cheating, but definitions seem to change over time. Also, is it because of NV-specific optimizations that they're losing? You seem to know this for sure. Far Cry is a TWIMTBP game (and you claim that Crytek is a major NV house), yet previous-gen AMD cards did much better than NV in it. In fact, the X1950 does so well that it's on par with the R600 in that game. Doesn't that itself point to something? Perhaps a problem. Perhaps their drivers are still bad. There are some nice gains coming in their next few drivers.


QUOTE(storm88 @ Sep 7 2007, 09:51 AM)
Erm... you guys know about OpenGL games, right?
BioShock, Crysis... I heard they are "OpenGL"-rendered games...

Get what I mean?
*
They are dx titles, not OpenGL. You "heard" wrong.
wodenus
post Sep 7 2007, 11:14 PM

Tree Octopus
********
All Stars
14,990 posts

Joined: Jan 2003
Well... all I know is my 8500GT cost Rm270 and plays the Bioshock demo well enough for me not to care. I'd like to see a Rm270 ATi card that can do just as well smile.gif
t3chn0m4nc3r
post Sep 7 2007, 11:25 PM

Teh Necron Lord
*******
Senior Member
4,139 posts

Joined: Sep 2006
From: Internet


QUOTE(X.E.D @ Sep 7 2007, 09:29 PM)
I think what you said suits you a lot more.
The 320MB is a piece of junk, failing to run even Call of Juarez in DX10 smoothly.
It delivers- yeah right.  rolleyes.gif
Fanboy detected.  laugh.gif



Now, back on semi on-topic.

BioShock is an nVidia SPONSORED game, not optimized by ATi drivers or anything.
It's just the same case as Lost Planet, game devs bought over to code deliberately FOR their cards (and in this case because both cards are very different, AGAINST ATi ones) in DX10. It's not cheating, but it's obviously anticompetitive and all. Futuremark's 3DM08 should be something better to compare to- it's not a game, but it's not using DX10 half-assedly or heavily supported by either side.

Truth is, look at any benchmark of Unreal Engine 3 in DX9 and it's obvious that the card you *didn't* buy wins. I took the long view, you did the current math. UE3 and other shader-heavy titles are the major cornerstone of future games, DX9 or 10. Oh, and enjoy your slower card running at 90 degrees under load. smile.gif
They cheated 3DMark03 with FP16 precision replacement, IIRC. That does affect quality; otherwise everyone would be using FP16 instead of the DX9 FP24 standard, right?
*
I totally agree... ATI is also better in Company of Heroes under DX9... thumbup.gif
Not into DX10 yet... I wonder if my 7600GS can perform better than my 2600XT in BioShock... whistling.gif

*BTW... paying game developers to "optimize" the game engine for the payer isn't what I call delivering... it's called bribery... imagine if I could afford to pay the whole Malaysian PDRM and the courts just to sue fanboys for being fanboys... that just isn't fair... icon_rolleyes.gif

This post has been edited by t3chn0m4nc3r: Sep 7 2007, 11:28 PM
ikanayam
post Sep 7 2007, 11:30 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(wodenus @ Sep 7 2007, 10:14 AM)
Well... all I know is my 8500GT cost Rm270 and plays the Bioshock demo well enough for me not to care. I'd like to see a Rm270 ATi card that can do just as well smile.gif
*
Haha. At what resolution/settings? The 2400xt should be priced similarly and should be about as good at the same setting. In any case, at this level it doesn't really matter because you have to turn pretty much everything off to get a playable framerate.
wodenus
post Sep 8 2007, 03:01 AM

Tree Octopus
********
All Stars
14,990 posts

Joined: Jan 2003
QUOTE(ikanayam @ Sep 7 2007, 11:30 PM)
Haha. At what resolution/settings? The 2400xt should be priced similarly and should be about as good at the same setting. In any case, at this level it doesn't really matter because you have to turn pretty much everything off to get a playable framerate.
*
1280x1024 (monitor native resolution) -- all settings on high with Vsync on smile.gif want me to post screenshots ? or a movie ? smile.gif
ikanayam
post Sep 8 2007, 03:30 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(wodenus @ Sep 7 2007, 02:01 PM)
1280x1024 (monitor native resolution) -- all settings on high with Vsync on smile.gif want me to post screenshots ? or a movie ? smile.gif
*
That's quite alright. What kind of framerates are you getting?
salimbest83
post Sep 8 2007, 04:33 AM

♥PMS on certain day♥
*******
Senior Member
8,647 posts

Joined: Feb 2006
From: Jelutong Penang



A 2400XT at 1280x1024...
I think you'd fall below 30 FPS...
not playable for action or racing games.
storm88
post Sep 8 2007, 09:24 AM

~UncleSam Ready to Rolls~
*******
Senior Member
5,595 posts

Joined: Jan 2003
From: Between Hell and Heaven
QUOTE(ikanayam @ Sep 8 2007, 12:11 AM)
They are dx titles, not OpenGL. You "heard" wrong.
*
Mm, maybe I used the wrong term tongue.gif paiseh (my bad).
BTW, what I meant is that these titles are "GAMES ON BY NVIDIA" titles:
they are sponsored by Nvidia, and the developers will design the game to be more optimized for NV cards.
sempronic
post Sep 8 2007, 12:29 PM

Enthusiast
*****
Senior Member
986 posts

Joined: Jul 2006
From: :: infront the steering ::


Yeah...
Nvidia's image quality is lower than ATi's...

but for performance... Nvidia wins...

that's what I know and have experienced...

anyway... Nvidia... I'll always be with you...
hahaha tongue.gif
HaHaNoCluE
post Sep 8 2007, 02:37 PM

Newbie
****
Senior Member
628 posts

Joined: Oct 2006


QUOTE(X.E.D @ Sep 7 2007, 08:29 PM)
I think what you said suits you a lot more.
The 320MB is a piece of junk, failing to run even Call of Juarez in DX10 smoothly.
It delivers- yeah right.  rolleyes.gif
Fanboy detected.  laugh.gif

Now, back on semi on-topic.

BioShock is an nVidia SPONSORED game, not optimized by ATi drivers or anything.
It's just the same case as Lost Planet, game devs bought over to code deliberately FOR their cards (and in this case because both cards are very different, AGAINST ATi ones) in DX10. It's not cheating, but it's obviously anticompetitive and all. Futuremark's 3DM08 should be something better to compare to- it's not a game, but it's not using DX10 half-assedly or heavily supported by either side.

Truth is, look at any benchmark of Unreal Engine 3 in DX9 and it's obvious that the card you *didn't* buy wins. I took the long view, you did the current math. UE3 and other shader-heavy titles are the major cornerstone of future games, DX9 or 10. Oh, and enjoy your slower card running at 90 degrees under load. smile.gif
They cheated 3DMark03 with FP16 precision replacement, IIRC. That does affect quality; otherwise everyone would be using FP16 instead of the DX9 FP24 standard, right?
*
I don't know, maybe I am a fanboy... I'm a fanboy of getting what I want when I pay, not after waiting a few extra months... I do have a few ATI GPUs too, for your info... I'm not saying ATI isn't good, it's just that the 2900XT didn't deliver what most people expected after waiting 6 months for it to launch... hell, the older X1950XTX was on par with the 2900XT in DX9 games when it launched... Oh yeah, not everyone plays BioShock or Call of Juarez... and I'm still on XP, not Vista... I went from a 7950GX2 to an 8800GTS not because the 8800GTS performs much better than the GX2, but just because my 975 mobo isn't friendly with it... I'm even getting a 2600XT for my HTPC with an Abit F-190H mobo because I wanted them in red... as simple as that, and the 2600XT has the simpler HDMI jack... So am I a fanboy of asking for performance/price ratio??? Yes I am...

GPU makers sponsoring games??? Yeah, what's wrong with it??? Well, if they didn't sponsor, we might not even have these games around... try having no one sponsor an F1 race... But taking fully sponsored games as a benchmark to compare against the other GPU maker's cards isn't so nice... Over time, better and greater cards are produced, so arguing about what a certain GPU maker might improve for the current card over the next 9 to 12 months isn't what users want to think much about... by that time, we might have already changed to a newer card again... doh.gif

Oh, BTW, anyone playing the latest Colin McRae rally game??? What GPU and what FPS are you getting with everything set to max at 1680 x 1050 resolution??? sweat.gif
X.E.D
post Sep 8 2007, 05:19 PM

curmudgeonosorus emeritus
******
Senior Member
1,955 posts

Joined: Jan 2006
From: Llanfair­pwllgwyngyll­gogery­ch


QUOTE(wodenus @ Sep 7 2007, 11:14 PM)
Well... all I know is my 8500GT cost Rm270 and plays the Bioshock demo well enough for me not to care. I'd like to see a Rm270 ATi card that can do just as well smile.gif
*
2600 PRO is only RM30 more, runs circles around the 8500GT.
(Check all the benchmarks, they support the logic- not just in Bioshock, but other games too. 2600XT can beat 8600GT, 2600Pro is 3/4 of a XT, 8500GT is less than 1/2 of a 8600GT. Of course you won't care, since you already have the card. And it runs! biggrin.gif)

I do pity nVidia somehow- DX9, where 90% of the market is, can't be "crippled" for any other GPU vendor by non-optimal (not non-optimized) code.
DX9 was pretty much within spec, as both vendors knew what code they were facing (well, besides the GeForce FX; nVidia's always right? RIGHT?), so they both had a balanced solution for whatever came next.

(@ikanayam- Far Cry was shader power bound, IS fillrate bound. X1900s were shader monsters.)

R600 and G80 are very different in how well they handle certain code, and most of the code that nV "sponsors" doesn't make much use of R600's skinny stream units, leaving the fat ones to do all the work. Lost Planet was brutal. Optimizations in general? Sure. Deeper-level un-crippling? Sure, IF AMD were the sponsor of BioShock. Now that the game is out, the code is done; IMHO little can be done to get the HD2900XT to run faster than the GTX at DX10, like it did in DX9.
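To make the skinny/fat units point concrete, here is a toy scheduling model, entirely made up for illustration and not either vendor's real hardware or shader compiler: a wide VLIW-style unit can issue several independent operations per clock, so a long chain of dependent scalar operations leaves most of its slots idle, while a 1-wide scalar design stays equally busy either way.

CODE
# Toy model (made up for illustration; not either vendor's actual hardware
# or shader compiler): a 5-wide VLIW unit can issue up to 5 *independent*
# ops per clock, so code that is one long dependency chain fills only
# 1 of its 5 slots, while a 1-wide scalar unit is equally busy either way.

def schedule_cycles(ops, width):
    """Greedy schedule; `ops` maps op name -> set of ops it depends on."""
    done, cycles = set(), 0
    remaining = list(ops)
    while remaining:
        ready = [op for op in remaining if ops[op] <= done]
        issued = ready[:width]          # at most `width` ops per clock
        done.update(issued)
        remaining = [op for op in remaining if op not in issued]
        cycles += 1
    return cycles

# Twenty fully independent ops versus twenty ops forming one dependency chain.
independent = {f"op{i}": set() for i in range(20)}
chain = {f"op{i}": ({f"op{i-1}"} if i else set()) for i in range(20)}

for label, workload in (("independent", independent), ("dependency chain", chain)):
    wide = schedule_cycles(workload, width=5)
    narrow = schedule_cycles(workload, width=1)
    print(f"{label}: 5-wide takes {wide} cycles, 1-wide takes {narrow} cycles")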


But keep in mind that nV can't sponsor every game. wink.gif

This post has been edited by X.E.D: Sep 8 2007, 05:30 PM
wodenus
post Sep 8 2007, 05:38 PM

Tree Octopus
********
All Stars
14,990 posts

Joined: Jan 2003
QUOTE(ikanayam @ Sep 8 2007, 03:30 AM)
That's quite alright. What kind of framerates are you getting?
*
I don't know smile.gif It's smooth enough to play and isn't a slideshow or totally annoying. The card's response rate is faster than mine smile.gif I think in the heat of the action you really don't count framerates. The human eye sees anything over 30fps as fluid movement, so I guess the framerate is somewhere around there. It's good enough to admire the graphics and fast enough to play. Bioshock isn't exactly run and gun; the demo has this guy who tells you what to do all the time, so you're basically just listening and watching and shooting in place.


Added on September 8, 2007, 5:54 pm
QUOTE(X.E.D @ Sep 8 2007, 05:19 PM)
2600 PRO is only RM30 more, runs circles around the 8500GT.
(Check all the benchmarks, they support the logic- not just in Bioshock, but other games too.


I don't play benchmarks smile.gif

QUOTE(X.E.D @ Sep 8 2007, 05:19 PM)
2600XT can beat 8600GT, 2600Pro is 3/4 of a XT, 8500GT is less than 1/2 of a 8600GT. Of course you won't care, since you already have the card. And it runs! biggrin.gif)


Exactly.. it's Rm270 and it plays the latest games reasonably well. In fact I'm quite surprised.. I mean it's Rm270, and it's not like 0.1fps with all the settings on high and vsync on.. and those were default too. Bioshock isn't that great a game though, they limit your freedom a lot. You have no choice but to follow this Atlas guy. It's got a horror-house quality about it, they push you through the game like it's an interactive guided tour.

QUOTE(X.E.D @ Sep 8 2007, 05:19 PM)
I do pity nVidia somehow- DX9, where 90% of the market is


Is, not will be...

QUOTE(X.E.D @ Sep 8 2007, 05:19 PM)
(@ikanayam- Far Cry was shader power bound, IS fillrate bound. X1900s were shader monsters.)


Wow Far Cry.. that's so ancient they're giving it away free smile.gif

QUOTE(X.E.D @ Sep 8 2007, 05:19 PM)
But keep in mind that nV can't sponsor every game. wink.gif


Hey it's not like NV is the only one in that game (ha !) -- there are ATi optimizations in 2moons... pity it's boring as all heck tongue.gif

PS. Totally loved World in Conflict. It's quite fun. That's a much better game than Bioshock (at least for me, I'm more of a strategy gamer than an FPS-er smile.gif )

This post has been edited by wodenus: Sep 9 2007, 11:30 AM
X.E.D
post Sep 8 2007, 06:30 PM

curmudgeonosorus emeritus
******
Senior Member
1,955 posts

Joined: Jan 2006
From: Llanfair­pwllgwyngyll­gogery­ch


@wodenus

You ASKED for something equivalent, so I kindly came up with one that runs games pretty well too- even the nV-sponsored one laugh.gif
wodenus
post Sep 8 2007, 09:32 PM

Tree Octopus
********
All Stars
14,990 posts

Joined: Jan 2003
QUOTE(X.E.D @ Sep 8 2007, 06:30 PM)
@wodenus

You ASKED for something equivalent, so I kindly came up with one that runs games pretty well too- even the nV-sponsored one laugh.gif
*
Heh... maybe it does.. but it's not Rm270 smile.gif even if it's a Rm30 difference, that's a really good dinner smile.gif
ikanayam
post Sep 9 2007, 12:40 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

Haha apparently you do not know wodenus.

QUOTE(X.E.D @ Sep 8 2007, 04:19 AM)
DX9 was pretty much within spec, as both vendors knew what code they were facing (well, besides the GeForce FX; nVidia's always right? RIGHT?), so they both had a balanced solution for whatever came next.
*
Please elaborate. How is this different from now? How did they know then and suddenly not know now? It's not like dx10 spec was suddenly pushed upon them. It's not like the dx10 games were developed in 3 weeks. So what is your point?


QUOTE(X.E.D @ Sep 8 2007, 04:19 AM)
(@ikanayam- Far Cry was shader power bound, IS fillrate bound. X1900s were shader monsters.)
*
You completely missed the point. The R600 is vastly superior in ALL those measures (at least on paper). So why is it losing to an X1950? If you don't see that as a sign of a problem somewhere (drivers and/or hardware), then I'm not sure what to say...


QUOTE(X.E.D @ Sep 8 2007, 04:19 AM)
R600 and G80 are very different in how well they handle certain code, and most of the code that nV "sponsors" doesn't make much use of R600's skinny stream units, leaving the fat ones to do all the work. Lost Planet was brutal. Optimizations in general? Sure. Deeper-level un-crippling? Sure, IF AMD were the sponsor of BioShock. Now that the game is out, the code is done; IMHO little can be done to get the HD2900XT to run faster than the GTX at DX10, like it did in DX9.
But keep in mind that nV can't sponsor every game. wink.gif
*
Most intriguing. Can you give an example of what they do that uses mostly the "fat" units and does not use the "thin" units? And how is the G80 different in handling such cases? Be as detailed as possible.
SlayerXT
post Sep 9 2007, 02:52 PM

PRIDE!
*******
Senior Member
2,042 posts

Joined: Jan 2003
From: KL



Maybe you should explain it to him instead, as an engineer.
