
 Nvidia Cheats~

ikanayam
post Sep 6 2007, 10:34 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(X.E.D @ Sep 6 2007, 09:25 AM)
No, nVidia gave 2K Boston/Australia money and tech and marketing so that they'd optimize DX10 for nVidia GPUs.
DX10 architecture on both cards couldn't be more different.

On DX9, the raw speed path, 2900XT kills the 8800 Ultra. No, not a bug.

This is not the first time the 2900XT has come out ahead on Unreal Engine 3, so obviously someone is up to something.
*
Nvidia also had their DX10 card ready long before ATI. This does make a difference. If the situation were reversed, you'd probably see the opposite happening. It's quite difficult to optimize for something that does not exist. The G80 was probably used as the main reference point for DX10 optimization in most DX10 engines being developed at the time.
HaHaNoCluE
post Sep 7 2007, 12:51 PM

Newbie
****
Senior Member
628 posts

Joined: Oct 2006


QUOTE(X.E.D @ Sep 6 2007, 05:05 PM)
Bioshock is an Nvidia "The Way It's Meant To Be Played Optimized" game.

Wait for Cat 7.9 or 8.0 when ATI can get some real optimizations.
Oh, and [T]ardOCP is heavily nvidia-slanted, just see the forums and you'll know.
Bioshock should be faster on the HD2900XT than the 8800ULTRA in DX9.
*
FanBoy detected... biggrin.gif ... nah, I don't care, I'll just buy the best bang for the buck, be it ATI, Nvidia or even S3!!! If it doesn't deliver then forget about it... I don't want to wait 6 months for a great driver, because in another few months you'll probably hear about another new GPU on the market again... icon_rolleyes.gif
Games For Windows
post Sep 7 2007, 03:22 PM

Getting Started
**
Junior Member
80 posts

Joined: Jul 2007
Nvidia always had a knack for 'optimising'... which is great. I mean after all, if they indeed do something to a particular test/benchmark WITHOUT affecting the visuals and/or quality, why not? I remember the 3DMark cheat thingy... I was like "as if anyone actually cares if the apple is really full or not full, because all you see is an entire fruit". Nvidia 'cheats'? That is definitely not nice to say...
tracyjz
post Sep 7 2007, 04:45 PM

-Jay Jay-
******
Senior Member
1,059 posts

Joined: Jan 2007
From: Petaling jaya
How do you cheat with hardware? o.O
sting79
post Sep 7 2007, 06:30 PM

Getting Started
**
Junior Member
123 posts

Joined: Nov 2006



QUOTE(HaHaNoCluE @ Sep 7 2007, 12:51 PM)
FanBoy detected... biggrin.gif ... nah, I don't care, I'll just buy the best bang for the buck, be it ATI, Nvidia or even S3!!! If it doesn't deliver then forget about it... I don't want to wait 6 months for a great driver, because in another few months you'll probably hear about another new GPU on the market again... icon_rolleyes.gif
*
Yeah, +1 to that. I almost wanted to get the 2900XT when it came out in May, but the bad early impressions deterred me from doing so. That's one less customer for ATI tongue.gif

Not saying that driver improvements over time are bad, but I believe the disappointing initial launch will somehow affect sales... a graphics card is not a PC game: when I want to get one, I don't wait for new drivers to improve the card's performance (can I be sure they will?), I get what is good at that time... whereas for a game, I could always wait a while for new patches to make the game better smile.gif

Just my humble 2 cents.
X.E.D
post Sep 7 2007, 08:29 PM

curmudgeonosorus emeritus
******
Senior Member
1,955 posts

Joined: Jan 2006
From: Llanfairpwllgwyngyllgogerych


QUOTE(HaHaNoCluE @ Sep 7 2007, 12:51 PM)
FanBoy detected... biggrin.gif ... nah, I don't care, I'll just buy the best bang for the buck, be it ATI, Nvidia or even S3!!! If it doesn't deliver then forget about it... I don't want to wait 6 months for a great driver, because in another few months you'll probably hear about another new GPU on the market again... icon_rolleyes.gif
*
I think what you said suits you a lot more.
The 320MB is a piece of junk, failing to run even Call of Juarez in DX10 smoothly.
It delivers- yeah right. rolleyes.gif
Fanboy detected. laugh.gif



Now, back to being semi on-topic.

BioShock is an nVidia SPONSORED game, not optimized by ATi drivers or anything.
It's the same case as Lost Planet: game devs paid to code deliberately FOR their cards (and in this case, because both cards are very different, AGAINST ATi ones) in DX10. It's not cheating, but it's obviously anticompetitive and all. Futuremark's 3DM08 should be something better to compare with- it's not a game, but it isn't using DX10 half-assedly or heavily supported by either side.

Truth is, look at any benchmark of Unreal Engine 3 in DX9 and it's obvious that the card you *didn't* buy wins. I took the long view, you did the current math. UE3 and other shader-heavy titles are the major cornerstone of future games, DX9 or 10. Oh, and enjoy your slower card running at 90 degrees under load. smile.gif

QUOTE(Games For Windows @ Sep 7 2007, 03:22 PM)
Nvidia always had a knack for 'optimising'... which is great. I mean after all, if they indeed do something to a particular test/benchmark WITHOUT affecting the visuals and/or quality, why not? I remember the 3DMark cheat thingy... I was like "as if anyone actually cares if the apple is really full or not full, because all you see is an entire fruit". Nvidia 'cheats'? That is definitely not nice to say...
*
They cheated 3DMark03 with FP16 precision replacement, IIRC. That does affect quality, otherwise everyone would be using FP16 instead of the DX9 FP24 standard, right?
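To put rough numbers on that precision gap, here is a small C sketch (my own illustration, not from the thread; the round_to_mantissa helper is hypothetical, not any real graphics API). FP16 keeps about 10 mantissa bits, the DX9 full-precision minimum FP24 keeps 16, and FP32 keeps 23, so a driver that silently swaps full precision for FP16 is throwing away resolution the shader math may rely on:

CODE
#include <stdio.h>
#include <math.h>

/* Hypothetical helper: round x to a mantissa of 'bits' binary digits. */
static double round_to_mantissa(double x, int bits)
{
    int exp;
    double m, scale;
    if (x == 0.0)
        return 0.0;
    m = frexp(x, &exp);           /* x = m * 2^exp, with 0.5 <= |m| < 1 */
    scale = ldexp(1.0, bits);     /* 2^bits */
    return ldexp(round(m * scale) / scale, exp);
}

int main(void)
{
    double x = 0.123456789;       /* e.g. a value feeding a long shader calculation */
    printf("FP16 (10-bit mantissa): %.9f\n", round_to_mantissa(x, 10));
    printf("FP24 (16-bit mantissa): %.9f\n", round_to_mantissa(x, 16));
    printf("FP32 (23-bit mantissa): %.9f\n", round_to_mantissa(x, 23));
    /* FP16 only resolves ~1/1024 mantissa steps, so error that is invisible in
       one operation piles up over a long shader and can show up on screen. */
    return 0;
}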



This post has been edited by X.E.D: Sep 7 2007, 08:32 PM
SlayerXT
post Sep 7 2007, 08:30 PM

PRIDE!
*******
Senior Member
2,042 posts

Joined: Jan 2003
From: KL



QUOTE(tracyjz @ Sep 7 2007, 04:45 PM)
How do you cheat with hardware? o.O
*
Usually they just use software to cover hardware weaknesses.
storm88
post Sep 7 2007, 10:51 PM

~UncleSam Ready to Rolls~
*******
Senior Member
5,595 posts

Joined: Jan 2003
From: Between Hell and Heaven
Erm... you guys know about OpenGL games, right?
Bioshock, Crysis... I heard they are "OpenGL"-rendered games...

Get what I mean?
ikanayam
post Sep 7 2007, 11:11 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(X.E.D @ Sep 7 2007, 07:29 AM)
I think what you said suits you a lot more.
The 320MB is a piece of junk, failing to run even Call of Juarez in DX10 smoothly.
It delivers- yeah right.  rolleyes.gif
Fanboy detected.  laugh.gif



Now, back to being semi on-topic.

BioShock is an nVidia SPONSORED game, not optimized by ATi drivers or anything.
It's the same case as Lost Planet: game devs paid to code deliberately FOR their cards (and in this case, because both cards are very different, AGAINST ATi ones) in DX10. It's not cheating, but it's obviously anticompetitive and all. Futuremark's 3DM08 should be something better to compare with- it's not a game, but it isn't using DX10 half-assedly or heavily supported by either side.

Truth is, look at any benchmark of Unreal Engine 3 in DX9 and it's obvious that the card you *didn't* buy wins. I took the long view, you did the current math. UE3 and other shader-heavy titles are the major cornerstone of future games, DX9 or 10. Oh, and enjoy your slower card running at 90 degrees under load. smile.gif
They cheated 3DMark03 with FP16 precision replacement, IIRC. That does affect quality, otherwise everyone would be using FP16 instead of the DX9 FP24 standard, right?
*
These days I think many games/benchmarks are being "optimized" by both companies. In the past, what they are doing now would have been considered cheating, but definitions seem to change over time. Also, is it because of NV-specific optimizations that they are losing? You seem to know this for sure. Far Cry is a TWIMTBP game (and you claim that Crytek is a major NV house), yet previous-gen AMD cards did much better than NV. In fact, the X1950 does so well in it that it's on par with the R600 in that game. Doesn't that itself point to something? Perhaps a problem. Perhaps their drivers are still bad. There are some nice gains coming in their next few drivers.


QUOTE(storm88 @ Sep 7 2007, 09:51 AM)
Erm... you guys know about OpenGL games, right?
Bioshock, Crysis... I heard they are "OpenGL"-rendered games...

Get what I mean?
*
They are DX titles, not OpenGL. You "heard" wrong.
wodenus
post Sep 7 2007, 11:14 PM

Tree Octopus
********
All Stars
14,990 posts

Joined: Jan 2003
Well... all I know is my 8500GT cost Rm270 and plays the Bioshock demo well enough for me not to care. I'd like to see a Rm270 ATi card that can do just as well smile.gif
t3chn0m4nc3r
post Sep 7 2007, 11:25 PM

Teh Necron Lord
*******
Senior Member
4,139 posts

Joined: Sep 2006
From: Internet


QUOTE(X.E.D @ Sep 7 2007, 09:29 PM)
I think what you said suits you a lot more.
The 320MB is a piece of junk, failing to run even Call of Juarez in DX10 smoothly.
It delivers- yeah right.  rolleyes.gif
Fanboy detected.  laugh.gif



Now, back to being semi on-topic.

BioShock is an nVidia SPONSORED game, not optimized by ATi drivers or anything.
It's the same case as Lost Planet: game devs paid to code deliberately FOR their cards (and in this case, because both cards are very different, AGAINST ATi ones) in DX10. It's not cheating, but it's obviously anticompetitive and all. Futuremark's 3DM08 should be something better to compare with- it's not a game, but it isn't using DX10 half-assedly or heavily supported by either side.

Truth is, look at any benchmark of Unreal Engine 3 in DX9 and it's obvious that the card you *didn't* buy wins. I took the long view, you did the current math. UE3 and other shader-heavy titles are the major cornerstone of future games, DX9 or 10. Oh, and enjoy your slower card running at 90 degrees under load. smile.gif
They cheated 3DMark03 with FP16 precision replacement, IIRC. That does affect quality, otherwise everyone would be using FP16 instead of the DX9 FP24 standard, right?
*
I totally agree... ATI is also better in Company of Heroes on DX9... thumbup.gif
I'm not into DX10 yet... I wonder if my 7600GS can perform better than my 2600XT in Bioshock... whistling.gif

*BTW... paying game developers to "optimize" the game engine for the payer ain't what I call delivering... it's called bribery... imagine if I could afford to pay the whole Malaysian PDRM and the courts just to sue fanboys for being fanboys... that just isn't fair... icon_rolleyes.gif

This post has been edited by t3chn0m4nc3r: Sep 7 2007, 11:28 PM
ikanayam
post Sep 7 2007, 11:30 PM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(wodenus @ Sep 7 2007, 10:14 AM)
Well... all I know is my 8500GT cost Rm270 and plays the Bioshock demo well enough for me not to care. I'd like to see a Rm270 ATi card that can do just as well smile.gif
*
Haha. At what resolution/settings? The 2400xt should be priced similarly and should be about as good at the same setting. In any case, at this level it doesn't really matter because you have to turn pretty much everything off to get a playable framerate.
wodenus
post Sep 8 2007, 03:01 AM

Tree Octopus
********
All Stars
14,990 posts

Joined: Jan 2003
QUOTE(ikanayam @ Sep 7 2007, 11:30 PM)
Haha. At what resolution/settings? The 2400xt should be priced similarly and should be about as good at the same setting. In any case, at this level it doesn't really matter because you have to turn pretty much everything off to get a playable framerate.
*
1280x1024 (monitor native resolution) -- all settings on high with Vsync on smile.gif want me to post screenshots ? or a movie ? smile.gif
ikanayam
post Sep 8 2007, 03:30 AM

there are no pacts between fish and men
********
Senior Member
10,544 posts

Joined: Jan 2003
From: GMT +8:00

QUOTE(wodenus @ Sep 7 2007, 02:01 PM)
1280x1024 (monitor native resolution) -- all settings on high with Vsync on smile.gif want me to post screenshots ? or a movie ? smile.gif
*
That's quite alright. What kind of framerates are you getting?
salimbest83
post Sep 8 2007, 04:33 AM

♥PMS on certain day♥
*******
Senior Member
8,647 posts

Joined: Feb 2006
From: Jelutong Penang



2400XT @ 1280x1024...
I think you'd fall below 30FPS..
Not playable for an action or racing game.
storm88
post Sep 8 2007, 09:24 AM

~UncleSam Ready to Rolls~
*******
Senior Member
5,595 posts

Joined: Jan 2003
From: Between Hell and Heaven
QUOTE(ikanayam @ Sep 8 2007, 12:11 AM)
They are DX titles, not OpenGL. You "heard" wrong.
*
Yeah, maybe I used the wrong definition tongue.gif, my bad.
BTW, what I mean is that these titles are "GAMES ON BY NVIDIA":
they are sponsored by Nvidia, and the developer will design the game to be more optimized for NV cards.
sempronic
post Sep 8 2007, 12:29 PM

Enthusiast
*****
Senior Member
986 posts

Joined: Jul 2006
From: :: infront the steering ::


Yeah...
Nvidia's image quality is lower than ATI's...

but for performance... Nvidia wins...

That's what I know and have experienced...

Anyway... Nvidia... I'll always be with you...
hahaha tongue.gif
HaHaNoCluE
post Sep 8 2007, 02:37 PM

Newbie
****
Senior Member
628 posts

Joined: Oct 2006


QUOTE(X.E.D @ Sep 7 2007, 08:29 PM)
I think what you said suits you a lot more.
The 320MB is a piece of junk, failing to run even Call of Juarez in DX10 smoothly.
It delivers- yeah right.  rolleyes.gif
Fanboy detected.  laugh.gif

Now, back to being semi on-topic.

BioShock is an nVidia SPONSORED game, not optimized by ATi drivers or anything.
It's the same case as Lost Planet: game devs paid to code deliberately FOR their cards (and in this case, because both cards are very different, AGAINST ATi ones) in DX10. It's not cheating, but it's obviously anticompetitive and all. Futuremark's 3DM08 should be something better to compare with- it's not a game, but it isn't using DX10 half-assedly or heavily supported by either side.

Truth is, look at any benchmark of Unreal Engine 3 in DX9 and it's obvious that the card you *didn't* buy wins. I took the long view, you did the current math. UE3 and other shader-heavy titles are the major cornerstone of future games, DX9 or 10. Oh, and enjoy your slower card running at 90 degrees under load. smile.gif
They cheated 3DMark03 with FP16 precision replacement, IIRC. That does affect quality, otherwise everyone would be using FP16 instead of the DX9 FP24 standard, right?
*
I don't know, maybe I am a fanboy... I'm a fanboy in that when I pay, I'll get what I want... not after waiting a few extra months... I do have a few ATI GPUs too, for your info... I'm not saying ATI isn't good, it's just that the 2900XT didn't deliver what most people expected after waiting 6 months for it to launch... hell, the older 1950XTX worked on par with the 2900XT in DX9 games when it launched... oh yeah, not everyone plays Bioshock or Call of Juarez... and I'm still on XP... not Vista... I've gone from a 7950GX2 to an 8800GTS not because the 8800GTS performs much better than the GX2, but just because my 975 mobo isn't friendly with it... I'm even getting a 2600XT for my HTPC with an Abit F-190H mobo because I wanted them in red... as simple as that, and the 2600XT has the simpler HDMI jack... so am I a fanboy for asking about performance/price ratio??? Yes I am...

GPU makers sponsoring games??? Yeah, what's wrong with it??? Well, if they didn't sponsor, we might not even have these games around... imagine if no one sponsored F1 racing... but taking fully sponsored games as a benchmark to compare against other GPU makers' cards isn't so nice... over time, better and greater cards are produced, so arguing about what a certain GPU maker might improve over the next 9 to 12 months for the current card isn't what users want to think about much... by that time, we might already have changed to a newer card again... doh.gif

Oh, BTW, anyone playing the latest Colin McRae rally game??? What GPU and what FPS are you getting with everything set to max at 1680 x 1050 resolution??? sweat.gif
X.E.D
post Sep 8 2007, 05:19 PM

curmudgeonosorus emeritus
******
Senior Member
1,955 posts

Joined: Jan 2006
From: Llanfairpwllgwyngyllgogerych


QUOTE(wodenus @ Sep 7 2007, 11:14 PM)
Well... all I know is my 8500GT cost Rm270 and plays the Bioshock demo well enough for me not to care. I'd like to see a Rm270 ATi card that can do just as well smile.gif
*
2600 PRO is only RM30 more, runs circles around the 8500GT.
(Check all the benchmarks, they support the logic- not just in Bioshock, but other games too. 2600XT can beat 8600GT, 2600Pro is 3/4 of a XT, 8500GT is less than 1/2 of a 8600GT. Of course you won't care, since you already have the card. And it runs! biggrin.gif)

I do pity nVidia somehow- DX9, where 90% of the market is, can't be "crippled" for any other GPU vendor by non-optimal (not non-optimized) code.
DX9 was pretty much in spec, as both vendors knew what code they were facing (well, besides GeForce FX- nVidia always right? RIGHT?), so they did have a balanced solution for whatever came next.

(@ikanayam- Far Cry was shader power bound, IS fillrate bound. X1900s were shader monsters.)

R600 and G80, instead, are very different in how they handle certain code, and most code that nV "sponsors" does not make much use of R600's skinny stream units, leaving the fat ones to do all the work. Lost Planet's was brutal. Optimizations in general? Sure. Deeper-level un-crippling? Sure, IF AMD were the sponsor of BioShock. Now that the game's out, the code's done; IMHO little can be done to get the HD2900XT to run faster than the GTX in DX10, like it did in DX9.
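A rough sketch of that point, written in plain C rather than shader code (my own illustration, not from the post): a wide, VLIW-style design only stays busy when the compiler can find several independent operations per instruction, while a dependency chain keeps one unit working and the rest idle.

CODE
#include <stdio.h>

#define N 4

int main(void)
{
    float a[N] = {1, 2, 3, 4}, b[N] = {5, 6, 7, 8}, out[N];
    float acc;
    int i;

    /* "Wide friendly": four independent multiply-adds; a VLIW compiler could
       pack these onto four thin units in one instruction word. */
    for (i = 0; i < N; i++)
        out[i] = a[i] * b[i] + 1.0f;

    /* "Wide hostile": a serial dependency chain; each step needs the previous
       result, so only one unit can do useful work at a time. */
    acc = a[0];
    for (i = 1; i < N; i++)
        acc = acc * b[i] + out[i - 1];

    printf("independent: %f %f %f %f, chained: %f\n",
           out[0], out[1], out[2], out[3], acc);
    return 0;
}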


But keep in mind that nV can't sponsor every game. wink.gif

This post has been edited by X.E.D: Sep 8 2007, 05:30 PM
wodenus
post Sep 8 2007, 05:38 PM

Tree Octopus
********
All Stars
14,990 posts

Joined: Jan 2003
QUOTE(ikanayam @ Sep 8 2007, 03:30 AM)
That's quite alright. What kind of framerates are you getting?
*
I don't know smile.gif it's smooth enough to play and isn't a slideshow or totally annoying. The card's response rate is faster than mine smile.gif I think in the heat of the action you really don't count framerates. The human eye sees movement as fluid at anything over 30fps, so I guess the framerate is somewhere around there. It's good enough to admire the graphics and still fast enough to play. Bioshock isn't exactly a run-and-gun; the demo has this guy who tells you what to do all the time, so you're basically just listening and watching and shooting in place.


Added on September 8, 2007, 5:54 pm
QUOTE(X.E.D @ Sep 8 2007, 05:19 PM)
2600 PRO is only RM30 more, runs circles around the 8500GT.
(Check all the benchmarks, they support the logic- not just in Bioshock, but other games too.


I don't play benchmarks smile.gif

QUOTE(X.E.D @ Sep 8 2007, 05:19 PM)
2600XT can beat 8600GT, 2600Pro is 3/4 of a XT, 8500GT is less than 1/2 of a 8600GT. Of course you won't care, since you already have the card. And it runs! biggrin.gif)


Exactly.. it's Rm270 and it plays the latest games reasonably well. In fact I'm quite surprised.. I mean it's Rm270, and it's not like 0.1fps with all the settings on high and vsync on.. and those were default too. Bioshock isn't that great a game though, they limit your freedom a lot. You have no choice but to follow this Atlas guy. It's got a horror-house quality about it, they push you through the game like it's an interactive guided tour.

QUOTE(X.E.D @ Sep 8 2007, 05:19 PM)
I do pity nVidia somehow- DX9, where 90% of the market is


Is, not will be...

QUOTE(X.E.D @ Sep 8 2007, 05:19 PM)
(@ikanayam- Far Cry was shader power bound, IS fillrate bound. X1900s were shader monsters.)


Wow Far Cry.. that's so ancient they're giving it away free smile.gif

QUOTE(X.E.D @ Sep 8 2007, 05:19 PM)
But keep in mind that nV can't sponsor every game. wink.gif


Hey it's not like NV is the only one in that game (ha !) -- there are ATi optimizations in 2moons... pity it's boring as all heck tongue.gif

PS. Totally loved World in Conflict. It's quite fun. That's a much better game than Bioshock (at least for me, I'm more of a strategy gamer than an FPS-er smile.gif )

This post has been edited by wodenus: Sep 9 2007, 11:30 AM
