Nvidia Cheats~
#1 | Sep 5 2007, 10:54 AM | Senior Member | 1,487 posts | Joined: Dec 2006 | From: Ja Bee
http://www.dailytech.com/AMD+Alleges+NVIDI...article8608.htm
What do you guys think about this news from DailyTech? Trustworthy?
#2 | Sep 5 2007, 12:35 PM | Senior Member | 7,173 posts | Joined: Jan 2003 | From: PCH
Alaa... easy la... nVIDIA's IQ (image quality) is not good, ATi's IQ is way better, no seriously...
#3 | Sep 5 2007, 12:39 PM | Senior Member | 2,042 posts | Joined: Jan 2003 | From: KL
That was for video playback, right? Not that I really care, I'm always into gaming...
#4 | Sep 5 2007, 12:42 PM | Senior Member | 7,173 posts | Joined: Jan 2003 | From: PCH
Yeap indeed... but in both (gaming & video rendering), ATi's always better. I don't know why, I don't know the reason... but it has been there since the Radeon 8 series.
#5 | Sep 5 2007, 12:55 PM | Senior Member | 1,091 posts | Joined: Oct 2005
Could be that Nvidia releases drivers which draw faster while ATI goes for drivers which draw more accurately.
If so, Nvidia will always post higher scores in benchmarks, with a not-too-visible reduction in graphics quality.
QUOTE: "As we openly told reviewers, using aggressive noise reduction settings may cause ghosting depending on the content played so we recommend using moderate settings. We also recommend the improved 163.44 drivers released a few weeks ago which reduce this effect," Allen said.
Maybe this shows what Nvidia was really up to with its improved drivers.
This post has been edited by ff7yta: Sep 5 2007, 12:56 PM
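Side note on that ghosting quote: temporal noise reduction basically blends each new frame with the previous ones, so the harder the filter leans on history, the cleaner static video looks but the more moving objects smear. A rough Python sketch of the idea (toy filter with made-up numbers, NOT Nvidia's actual driver code):

CODE
# Toy temporal noise-reduction filter: blends each new frame with the
# previous filtered frame. "strength" is the weight given to history.
# Hypothetical illustration only, not any vendor's real algorithm.

def temporal_denoise(frames, strength=0.5):
    """frames: list of 2D lists (grayscale). Returns filtered frames."""
    filtered = []
    prev = None
    for frame in frames:
        if prev is None:
            out = [row[:] for row in frame]           # first frame passes through
        else:
            out = [
                [strength * p + (1.0 - strength) * c  # heavy history weight = less
                 for p, c in zip(prow, crow)]         # noise, but moving objects
                for prow, crow in zip(prev, frame)    # leave trails (ghosting)
            ]
        filtered.append(out)
        prev = out
    return filtered

# A bright dot moving right across a dark background, one pixel per frame:
frames = [[[255 if x == t else 0 for x in range(6)]] for t in range(6)]
aggressive = temporal_denoise(frames, strength=0.8)   # "aggressive" setting
moderate   = temporal_denoise(frames, strength=0.3)   # "moderate" setting
print(aggressive[5][0])  # old dot positions still glow -> visible ghost trail
print(moderate[5][0])    # trail decays much faster, at the cost of less smoothing

With the "aggressive" weight, the old dot positions are still bright several frames later, which is exactly the ghosting being described; the "moderate" weight trades a bit of noise for a trail that dies off quickly.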
#6 | Sep 5 2007, 04:01 PM | Senior Member | 1,966 posts | Joined: Jan 2003 | From: No longer hanging by a NUS
ATI -> Quack.exe
NV -> 3DMark
The pot calling the kettle black.
#7 | Sep 5 2007, 04:55 PM | Senior Member | 1,942 posts | Joined: Nov 2005 | From: Penang
Sounds like a bug on Nvidia's side, and their latest driver is just trying to mask the problem. But then again, not that I would even care, since I only use the lowly 6600GT.
#8 | Sep 5 2007, 05:16 PM | Senior Member | 628 posts | Joined: Oct 2006
But recently the 2900XT does score very well in 3DMark yet doesn't perform very well in games...
#9 | Sep 5 2007, 06:21 PM | Senior Member | 863 posts | Joined: Mar 2007 | From: Tanjung Segitiga Masonic Lodge
Sep 5 2007, 08:20 PM | Elite | 4,603 posts | Joined: Jan 2003 | From: PJ
QUOTE(sniper69 @ Sep 5 2007, 12:42 PM): yeap indeed... but in both (gaming & video rendering), ATi's always better, i don't know why, i don't know the reason... but it was there since Radeon 8 series
Used to be that the AF quality of Nvidia cards was inferior to ATI's. That all changed with the 7-series, iianm. In terms of 3D rendering, they're both pretty much equal now.
Sep 5 2007, 08:44 PM | Senior Member | 1,769 posts | Joined: Jan 2003 | From: Malaysia
QUOTE(Eoma @ Sep 5 2007, 08:20 PM): Used to be that the AF quality of Nvidia cards was inferior to ATI's. That all changed with the 7-series, iianm. In terms of 3D rendering, they're both pretty much equal now.
The AF quality for NV and ATi only became equal with the 8-series. A lot of people complained about the 7-series quality, which was even worse than the 6-series.
Sep 5 2007, 08:51 PM | Senior Member | 628 posts | Joined: Oct 2006
Tried the 2600XT DDR4; it's complete suicide gaming with AA + AF on... very, very low FPS and the image quality is really annoying... some 2900XT users reported the same issues with their GPU: whenever AA + AF is on, the card is dying... (I was impressed with its 3DMark score but gave up on the idea of changing to a 2900XT after some comments on gaming)... I don't want to play games without AA + AF on...
Sep 5 2007, 10:52 PM | Elite | 4,603 posts | Joined: Jan 2003 | From: PJ
Sep 5 2007, 11:48 PM | Senior Member | 10,544 posts | Joined: Jan 2003 | From: GMT +8:00
Sep 5 2007, 11:53 PM | Senior Member | 1,051 posts | Joined: Feb 2005 | From: Somewhere Out There
Yea, they did a good job there, and it's about the ONLY thing where the Nvidia card kicks the 2900XT's ass. Other than the AA/AF issue, judging from other factors I don't see why the 2900XT should be left behind at all. And the difference between the 2900XT and the 8800GTX is HUGE.
Sep 6 2007, 12:05 AM | Senior Member | 10,544 posts | Joined: Jan 2003 | From: GMT +8:00
QUOTE(AoiNatsume @ Sep 5 2007, 10:53 AM): yea, they did a good job there, and its about the ONLY thing where nvidia card kicks 2900XT's ass. Other than AA/AF issue, judging from other factors i dont see why 2900XT should be left behind at all. And the difference between 2900XT and 8800GTX is HUGE.
Well, as far as I can see, NV did such a good job and/or ATI did such a bad one that they had to price their top-end card one step lower than NV's top-end card. And that says a lot. Clearly this was NOT what they had in mind for R600. G80 seems to be the better chip in many ways. Why shouldn't the 2900XT be left behind? Because its paper specs look impressive?
Sep 6 2007, 01:42 AM | Junior Member | 387 posts | Joined: May 2007 | From: Pahang, Sentul, MMU Malacca
I'm not sure whether this is relevant to this topic or not.
http://enthusiast.hardocp.com/article.html...W50aHVzaWFzdA== |
Sep 6 2007, 05:05 PM | Senior Member | 1,955 posts | Joined: Jan 2006 | From: Llanfairpwllgwyngyllgogerych
QUOTE(The Scent LYN @ Sep 6 2007, 01:42 AM): I'm not sure whether this is relevant to this topic or not. http://enthusiast.hardocp.com/article.html...W50aHVzaWFzdA==
Bioshock is an Nvidia "The Way It's Meant To Be Played" title. Wait for Cat 7.9 or 8.0 when ATI can get some real optimizations. Oh, and [T]ardOCP is heavily nvidia-slanted, just see the forums and you'll know. Bioshock should be faster on the HD2900XT than the 8800 ULTRA in DX9.
Sep 6 2007, 09:15 PM | Senior Member | 2,042 posts | Joined: Jan 2003 | From: KL
Any card is meant to be played as long as it can provide more than 30fps continuously.
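Putting numbers on that: "more than 30 fps continuously" means every single frame has to finish within about 33 ms, so it's the worst frame that matters, not the average. A quick sketch in Python (the per-frame times are made up, just to illustrate):

CODE
# "More than 30 fps continuously" = no frame may take longer than 1/30 s.
frame_times_ms = [12, 14, 13, 45, 15, 16]   # hypothetical per-frame render times
budget_ms = 1000.0 / 30                     # ~33.3 ms per frame at 30 fps

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
min_fps = 1000.0 / max(frame_times_ms)

print(round(avg_fps, 1))                    # ~52 fps average looks fine...
print(round(min_fps, 1))                    # ...but the 45 ms spike dips to ~22 fps
print(all(t <= budget_ms for t in frame_times_ms))  # False -> not "continuous" 30 fps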
Sep 6 2007, 10:25 PM | Senior Member | 1,955 posts | Joined: Jan 2006 | From: Llanfairpwllgwyngyllgogerych
QUOTE(§layerXT @ Sep 6 2007, 09:15 PM): Any card is meant to be played as long as it can provide more than 30fps continuously.
No, nVidia gave 2K Boston/Australia money and tech and marketing so that they'd optimize DX10 for nVidia GPUs. The DX10 architecture on both cards couldn't be more different. On DX9, the raw speed path, the 2900XT kills the 8800 Ultra. No, not a bug. This is not the first time the 2900XT has come out on top on Unreal Engine 3, so obviously someone is up to something.
This post has been edited by X.E.D: Sep 6 2007, 10:27 PM
Sep 6 2007, 10:34 PM | Senior Member | 10,544 posts | Joined: Jan 2003 | From: GMT +8:00
QUOTE(X.E.D @ Sep 6 2007, 09:25 AM): No, nVidia gave 2K Boston/Australia money and tech and marketing so that they'd optimize DX10 for nVidia GPUs. DX10 architecture on both cards couldn't be more different. On DX9, the raw speed path, 2900XT kills the 8800 Ultra. No, not a bug. This is not the first time where 2900XT is winning on Unreal Engine 3, so obviously someone is up to something.
Nvidia also had their DX10 card ready long before ATI, and that does make a difference. If the situation were reversed, you'd probably see the opposite happening. It's quite difficult to optimize for something that does not exist. The G80 was probably used as the main reference point for DX10 optimization in most DX10 engines that were being developed at the time.
Sep 7 2007, 12:51 PM | Senior Member | 628 posts | Joined: Oct 2006
QUOTE(X.E.D @ Sep 6 2007, 05:05 PM): Bioshock is an Nvidia "The Way It's Meant To Be Played" title. Wait for Cat 7.9 or 8.0 when ATI can get some real optimizations. Oh, and [T]ardOCP is heavily nvidia-slanted, just see the forums and you'll know. Bioshock should be faster on the HD2900XT than the 8800 ULTRA in DX9.
FanBoy detected...
Sep 7 2007, 03:22 PM | Junior Member | 80 posts | Joined: Jul 2007
Nvidia always had a knack for 'optimising'... which is great. I mean, after all, if they indeed do something to a particular test/benchmark WITHOUT affecting the visuals and/or quality, why not? I remember the 3DMark cheat thingy... I was like "as if anyone actually cares if the apple is really full or not, because all you see is an entire fruit". Nvidia 'cheats'? That is definitely not nice to say...
Sep 7 2007, 04:45 PM | Senior Member | 1,059 posts | Joined: Jan 2007 | From: Petaling Jaya
How do you cheat hardware? o.O
Sep 7 2007, 06:30 PM | Junior Member | 123 posts | Joined: Nov 2006
QUOTE(HaHaNoCluE @ Sep 7 2007, 12:51 PM): FanBoy detected...
Yeah, +1 to that. I almost wanted to get the 2900XT when it came out in May, but the bad early impressions deterred me from doing so. That's one less customer for ATI. Not saying that driver improvements over time are bad, but I believe the disappointing initial launch will somehow affect sales... this is not a PC game: when I want to get a gfx card, I don't wait for new drivers to improve the card's performance (can I be sure they will?), I get what is good at that time... whereas for a game, I could always wait a while for new patches to make it better. Just my humble 2 cents.
Sep 7 2007, 08:29 PM | Senior Member | 1,955 posts | Joined: Jan 2006 | From: Llanfairpwllgwyngyllgogerych
QUOTE(HaHaNoCluE @ Sep 7 2007, 12:51 PM): FanBoy detected...
I think what you said suits you a lot more. The 320MB is a piece of junk, failing to run even Call of Juarez in DX10 smoothly. It delivers... yeah right. Fanboy detected.
Now, back to semi on-topic. BioShock is an nVidia SPONSORED game, not optimized by ATi drivers or anything. It's the same case as Lost Planet: game devs bought over to code deliberately FOR their cards (and in this case, because both cards are very different, AGAINST ATi's) in DX10. It's not cheating, but it's obviously anticompetitive and all. Futuremark's 3DM08 should be something better to compare with: it's not a game, but it's not using DX10 half-assedly, nor is it heavily supported by either side. Truth is, look at any benchmark on Unreal Engine 3 in DX9 and it's obvious that the card you *didn't* buy wins. I did the long outlook, you did the current math. UE3 and other shader-heavy titles are the major cornerstone of future games, DX9 or 10. Oh, and enjoy your slower card running at 90 degrees under load.
QUOTE(Games For Windows @ Sep 7 2007, 03:22 PM): Nvidia always had a knack for 'optimising'... which is great. I mean after all, if they indeed do something to a particular test/benchmark WITHOUT affecting the visuals and/or quality, why not? I remember the 3DMark cheat thingy... I was like "as if anyone actually cares if the apple is really full or not full, because all you see is an entire fruit". Nvidia 'cheats'? That is definitely not nice to say...
They cheated 3DMark03 with FP16 precision replacement, IIRC. That does affect quality, otherwise everyone would be using FP16 instead of the DX9 FP24 standard, right?
This post has been edited by X.E.D: Sep 7 2007, 08:32 PM
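For anyone wondering why precision replacement counts as a quality change, here's a quick NumPy sketch. There's no FP24 type on a CPU, so float16 vs float32 stands in for "reduced" vs "full" shader precision; the vectors and the little dot-product "shader" are made up, not the actual 3DMark03 code.

CODE
# Illustration of why lowering shader precision is a quality change, not a free win.
# DX9's "full precision" minimum was FP24; NumPy has no FP24 type, so float16 vs
# float32 is used here as a stand-in for "reduced" vs "full" precision.
import numpy as np

light_dir = np.array([0.30001, 0.64003, 0.70710], dtype=np.float32)
normal    = np.array([0.29999, 0.64001, 0.70712], dtype=np.float32)

full    = np.dot(light_dir, normal)                              # float32 math
reduced = np.dot(light_dir.astype(np.float16),
                 normal.astype(np.float16)).astype(np.float32)   # float16 math

print(full, reduced, abs(full - reduced))
# The absolute error looks tiny, but per-pixel lighting terms get raised to powers,
# accumulated over many instructions, and quantized to 8-bit colour, so the
# differences show up as visible banding/shimmer in real scenes.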
Sep 7 2007, 08:30 PM | Senior Member | 2,042 posts | Joined: Jan 2003 | From: KL
Sep 7 2007, 10:51 PM | Senior Member | 5,595 posts | Joined: Jan 2003 | From: Between Hell and Heaven
Erm... you guys should know OpenGL games, right?
Bioshock, Crysis... I heard they are "OpenGL"-rendered games... get what I mean?
Sep 7 2007, 11:11 PM | Senior Member | 10,544 posts | Joined: Jan 2003 | From: GMT +8:00
QUOTE(X.E.D @ Sep 7 2007, 07:29 AM): I think what you said suits you a lot more. The 320MB is a piece of junk, failing to run even Call of Juarez in DX10 smoothly. [snip]
These days I think many games/benchmarks are being "optimized" by both companies. In the past what they are doing would have been considered cheating, but definitions seem to change over time. Also, is it because of NV-specific optimizations that they are losing? You seem to know this for sure. Far Cry is a TWIMTBP game (and you claim that Crytek is a major NV house), yet previous-gen AMD cards did much better than NV. In fact, the X1950 does so well in it, it's on par with the R600 in that game. Doesn't that itself point to something? Perhaps a problem. Perhaps their drivers are still bad. There are some nice gains coming in their next few drivers.
QUOTE(storm88 @ Sep 7 2007, 09:51 AM): erm.. u guys should noe OpenGL games rite? Bioshock, Crysis.. i heard they are "openGL" rendered games... get wat i mean?
They are DX titles, not OpenGL. You "heard" wrong.
Sep 7 2007, 11:14 PM | All Stars | 14,990 posts | Joined: Jan 2003
Well... all I know is my 8500GT cost RM270 and plays the Bioshock demo well enough for me not to care. I'd like to see an RM270 ATi card that can do just as well.
Sep 7 2007, 11:25 PM | Senior Member | 4,139 posts | Joined: Sep 2006 | From: Internet
QUOTE(X.E.D @ Sep 7 2007, 09:29 PM): I think what you said suits you a lot more. The 320MB is a piece of junk, failing to run even Call of Juarez in DX10 smoothly. [snip]
I totally agree... ATI is also better in Company of Heroes on DX9... I'm not into DX10 yet... I wonder if my 7600GS can perform better than my 2600XT in Bioshock...
*BTW, paying game developers to "optimize" the game engine for the payer ain't what I call delivering... it's called bribery... imagine if I could afford to pay the whole Malaysian PDRM and the courts just to sue fanboys for being fanboys... that just isn't fair...
This post has been edited by t3chn0m4nc3r: Sep 7 2007, 11:28 PM
Sep 7 2007, 11:30 PM | Senior Member | 10,544 posts | Joined: Jan 2003 | From: GMT +8:00
QUOTE(wodenus @ Sep 7 2007, 10:14 AM): Well... all I know is my 8500GT cost Rm270 and plays the Bioshock demo well enough for me not to care. I'd like to see a Rm270 ATi card that can do just as well
Haha. At what resolution/settings? The 2400XT should be priced similarly and should be about as good at the same setting. In any case, at this level it doesn't really matter because you have to turn pretty much everything off to get a playable framerate.
Sep 8 2007, 03:01 AM | All Stars | 14,990 posts | Joined: Jan 2003
QUOTE(ikanayam @ Sep 7 2007, 11:30 PM): Haha. At what resolution/settings? The 2400XT should be priced similarly and should be about as good at the same setting. In any case, at this level it doesn't really matter because you have to turn pretty much everything off to get a playable framerate.
1280x1024 (monitor native resolution) -- all settings on high with Vsync on.
Sep 8 2007, 03:30 AM | Senior Member | 10,544 posts | Joined: Jan 2003 | From: GMT +8:00
Sep 8 2007, 04:33 AM | Senior Member | 8,647 posts | Joined: Feb 2006 | From: Jelutong Penang
A 2400XT @ 1280x1024... I think you'd fall below 30 FPS... not playable for an action or racing game.
Sep 8 2007, 09:24 AM | Senior Member | 5,595 posts | Joined: Jan 2003 | From: Between Hell and Heaven
Sep 8 2007, 12:29 PM | Senior Member | 986 posts | Joined: Jul 2006 | From: :: infront the steering ::
Yeah... Nvidia's image quality is lower than ATI's... but for performance, Nvidia wins... that's what I know and have experienced ar... anyway... Nvidia... I'll always be with you... hahaha
Sep 8 2007, 02:37 PM | Senior Member | 628 posts | Joined: Oct 2006
QUOTE(X.E.D @ Sep 7 2007, 08:29 PM): I think what you said suits you a lot more. The 320MB is a piece of junk, failing to run even Call of Juarez in DX10 smoothly. [snip]
I dunno, maybe I am a fanboy... I'm a fanboy when I pay: I'll get what I want, not after waiting a few extra months... I do have a few ATI GPUs too, for your info... I'm not saying ATI isn't good, just that the 2900XT didn't deliver what most people expected after waiting six months for it to launch... hell, the older X1950XTX worked on par with the 2900XT in DX9 games when it launched... Oh ya, not everyone plays Bioshock or Call of Juarez... and I'm still on XP, not Vista... I've gone from a 7950GX2 to an 8800GTS not because the 8800GTS performs much better than the GX2, but because my 975 mobo isn't friendly with it... I'm even getting a 2600XT for my HTPC with an Abit F-190H mobo because I wanted them in red... as simple as that, and the 2600XT has the simpler HDMI jack... so am I a fanboy of asking for performance/price ratio??? Yes I am...
GPU makers sponsoring games??? Yeah, what's wrong with it??? Well, if they didn't sponsor, we might not even have these games around... try having no one sponsor an F1 race... but taking fully sponsored games as a benchmark to compare against other GPU makers' cards isn't so nice... over time, better and greater cards are produced, so arguing about what a certain GPU maker might improve in the next 9 to 12 months for the current card isn't what users want to think about much... by that time, we might already have changed to a newer card again...
Oh, btw, anyone playing the latest Colin McRae rally game??? What GPU and what FPS are you getting with everything set to max at 1680 x 1050?
Sep 8 2007, 05:19 PM | Senior Member | 1,955 posts | Joined: Jan 2006 | From: Llanfairpwllgwyngyllgogerych
QUOTE(wodenus @ Sep 7 2007, 11:14 PM): Well... all I know is my 8500GT cost Rm270 and plays the Bioshock demo well enough for me not to care. I'd like to see a Rm270 ATi card that can do just as well
The 2600 PRO is only RM30 more and runs circles around the 8500GT. (Check all the benchmarks, they support the logic, not just in Bioshock but in other games too. The 2600XT can beat the 8600GT, the 2600 Pro is 3/4 of an XT, and the 8500GT is less than 1/2 of an 8600GT.) Of course you won't care, since you already have the card. And it runs!
I do pity nVidia somehow: DX9, where 90% of the market is, can't be "crippled" for any other GPU vendor by non-optimal (not non-optimized) code. DX9 was pretty much in spec, as both vendors knew what code they were facing (well, besides the GeForce FX... nvidia always right? RIGHT?), so they did have a balanced solution for whatever came next. (@ikanayam: Far Cry was shader power bound, IS fillrate bound. The X1900s were shader monsters.) R600 and G80 instead are very different in how well they handle certain code, and most code that nV "sponsors" doesn't make much use of R600's skinny stream units, leaving the fatty ones to do all the work. Lost Planet's was brutal. Optimizations in general? Sure. Deeper-level un-crippling? Sure, IF AMD were the sponsor of BioShock. Now that the game's out and the code's done, IMHO little can be done to get the HD2900XT to run faster than the GTX at DX10, like it did in DX9. But keep in mind that nV can't sponsor every game.
This post has been edited by X.E.D: Sep 8 2007, 05:30 PM
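Rough picture of what "skinny vs fatty units" means: R600-style shader blocks are usually described as 5-wide VLIW (four simple ALUs plus one fat one), so they only fly when the compiler can pack several independent operations per clock, while G80-style scalar units issue one op per thread per clock regardless. A toy Python model of that packing (the shader length and issue model are made up, not real R600/G80 scheduling):

CODE
# Toy issue model: a 5-wide VLIW block retires up to 5 *independent* ops per
# cycle; a scalar unit retires 1 op per cycle regardless of dependencies.
# Purely illustrative numbers, not real hardware behaviour.

def vliw_cycles(ops_per_pixel, independent_per_group, width=5):
    # Can only pack as many ops per cycle as are independent of each other.
    packed = min(width, independent_per_group)
    return -(-ops_per_pixel // packed)      # ceiling division

def scalar_cycles(ops_per_pixel):
    return ops_per_pixel

shader_ops = 100                            # hypothetical shader length

# Vector-friendly code (e.g. float4 colour math): ~5 independent ops at a time.
print(vliw_cycles(shader_ops, independent_per_group=5))   # 20 cycles per pixel
# Serial, dependent scalar code: each op needs the previous result.
print(vliw_cycles(shader_ops, independent_per_group=1))   # 100 cycles, 4/5 of the unit idle
print(scalar_cycles(shader_ops))                          # scalar unit: always 100

The point of the toy model: the same shader costs the VLIW block five times as many cycles when its operations are serial and dependent, which is one way "sponsored" code could end up suiting one architecture far better than the other without any single line of it being a cheat.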
Sep 8 2007, 05:38 PM | All Stars | 14,990 posts | Joined: Jan 2003
QUOTE(ikanayam @ Sep 8 2007, 03:30 AM):
I don't know

Added on September 8, 2007, 5:54 pm

QUOTE(X.E.D @ Sep 8 2007, 05:19 PM): 2600 PRO is only RM30 more, runs circles around the 8500GT. (Check all the benchmarks, they support the logic- not just in Bioshock, but other games too.
I don't play benchmarks

QUOTE(X.E.D @ Sep 8 2007, 05:19 PM): 2600XT can beat 8600GT, 2600Pro is 3/4 of a XT, 8500GT is less than 1/2 of a 8600GT. Of course you won't care, since you already have the card. And it runs!
Exactly... it's RM270 and it plays the latest games reasonably well. In fact I'm quite surprised... I mean it's RM270, and it's not like 0.1 fps with all the settings on high and vsync on... and those were the defaults too. Bioshock isn't that great a game though; they limit your freedom a lot. You have no choice but to follow this Atlas guy. It's got a horror-house quality about it; they push you through the game like it's an interactive guided tour.

QUOTE(X.E.D @ Sep 8 2007, 05:19 PM): I do pity nVidia somehow- DX9, where 90% of the market is
Is, not will be...

QUOTE(X.E.D @ Sep 8 2007, 05:19 PM): (@ikanayam- Far Cry was shader power bound, IS fillrate bound. X1900s were shader monsters.)
Wow, Far Cry... that's so ancient they're giving it away free

QUOTE(X.E.D @ Sep 8 2007, 05:19 PM): But keep in mind that nV can't sponsor every game.
Hey, it's not like NV is the only one in that game (ha!) -- there are ATi optimizations in 2moons... pity it's boring as all heck
PS. Totally loved World in Conflict. It's quite fun. That's a much better game than Bioshock (at least for me, I'm more of a strategy gamer than an FPS-er).

This post has been edited by wodenus: Sep 9 2007, 11:30 AM
Sep 8 2007, 06:30 PM | Senior Member | 1,955 posts | Joined: Jan 2006 | From: Llanfairpwllgwyngyllgogerych
@wodenus
You ASKED for something equivalent, so I kindly came up with one that runs games pretty well too, even the nV-sponsored one.
Sep 8 2007, 09:32 PM | All Stars | 14,990 posts | Joined: Jan 2003
Sep 9 2007, 12:40 AM | Senior Member | 10,544 posts | Joined: Jan 2003 | From: GMT +8:00
Haha, apparently you do not know, wodenus.

QUOTE(X.E.D @ Sep 8 2007, 04:19 AM): DX9 was pretty much in spec as both vendors know what code they were facing against (well, besides Geforce FX- nvidia always right? RIGHT?) so they did have the balanced solution for whatever was next.
Please elaborate. How is this different from now? How did they know then and suddenly not know now? It's not like the DX10 spec was suddenly pushed upon them. It's not like the DX10 games were developed in 3 weeks. So what is your point?

QUOTE(X.E.D @ Sep 8 2007, 04:19 AM): (@ikanayam- Far Cry was shader power bound, IS fillrate bound. X1900s were shader monsters.)
You completely missed the point. R600 is vastly superior in ALL those measures (at least on paper). So why is it losing to an X1950? If you don't see that as a sign of a problem somewhere (drivers and/or hardware), then I'm not sure what to say...

QUOTE(X.E.D @ Sep 8 2007, 04:19 AM): R600 and G80 instead are very different in the aptitude in which they handle certain code, and most code that nV "sponsors" do not take much use of R600's skinny stream units, leaving the fatty ones to do all the work. Lost Planet's was brutal. [snip] But keep in mind that nV can't sponsor every game.
Most intriguing. Can you give an example of what they do that uses mostly the "fat" units and does not use the "thin" units? And how is G80 different in handling such cases? Be as detailed as possible.
Sep 9 2007, 02:52 PM | Senior Member | 2,042 posts | Joined: Jan 2003 | From: KL
Maybe you should explain it to him instead, as an engineer.
Topic Closed