AMD Radeon™ Discussion V4, Cut Cut Cut!~ Go Red !

QUOTE(Faint @ Nov 5 2010, 05:15 PM)
+1. The HD5850 only lacks 3D support and better CrossFire scaling; the only reasons to consider the HD6850 are if your PC has just one six-pin power connector, or if you want to OC the HD6850 to HD6870 performance.
Nov 5 2010, 05:44 PM
#1
Senior Member
975 posts · Joined: Aug 2007 · From: Lokap Polis
Nov 11 2010, 09:58 AM
#2
QUOTE(DarkSilver @ Nov 11 2010, 04:31 AM) Wait for the Radeon HD6900-Series. Because the release date is near too. Sometimes you might get a price cut because AMD and Nvidia are competing like hell.

I thought the media said the HD6900 release date has been delayed... nah... I hope they're just bluffing. I just hope they can get at least a 25% performance increase for the same die size (though the Cayman die may be bigger).
Nov 11 2010, 04:14 PM
#3
leaked Cayman spec:

[leaked spec slide]

A shocking 1920 shaders with 3.5 TFLOPs. If this is true, the die size will be really big... hope it won't cost as much as the GTX580. There are also rumours that the HD6970 is faster than the GTX580. In a direct comparison with the HD6870, the HD6970 would have more than 70% more raw shader power. AMD Fermi, anyone?

This post has been edited by zerorating: Nov 11 2010, 04:20 PM
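A quick sanity check on those leaked numbers: the peak-rate formula shaders × 2 FLOPs per cycle × core clock is the standard one for these GPUs, but the 1920-shader and 3.5 TFLOPs inputs are the rumoured figures, not confirmed specs:

```python
# Peak FP32 throughput is shaders * 2 FLOPs per cycle (one multiply-add)
# * core clock. Working backwards from the leaked 3.5 TFLOPs and 1920
# shaders gives the core clock the rumour implies.

def peak_tflops(shaders: int, clock_mhz: float) -> float:
    """Peak FP32 rate in TFLOPs: shaders * 2 FLOPs/cycle * clock."""
    return shaders * 2 * clock_mhz * 1e6 / 1e12

def implied_clock_mhz(tflops: float, shaders: int) -> float:
    """Core clock (MHz) implied by a quoted TFLOPs figure."""
    return tflops * 1e12 / (shaders * 2) / 1e6

print(round(implied_clock_mhz(3.5, 1920)))  # ~911 MHz
```

A clock in the low 900s would be plausible for the rumoured part, so the two leaked figures are at least self-consistent.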
Nov 11 2010, 05:54 PM
#4
QUOTE(Boldnut @ Nov 11 2010, 05:12 PM) It would be hard to imagine what the 6990 will be. And I don't think it will be dual 6870, because that would equal the 5970's performance. Based on this, Cayman will not beat the 580, because it needs to fit the TDP for a dual card. But AMD might just use the 6990 (dual underclocked Cayman) to beat the GTX580.

erm... AMD targeted <300W for Cayman XT. Why can't it beat the GTX580 when the card has a new architecture (four medium-complexity shaders, compared to one wide plus four low-complexity shaders since the HD4000), while the GTX580 is just a fixed Fermi? You know Cayman R&D has been going on for a year since the HD5000 was released. I bet they're using two Cayman Pros on the HD6990. With Barts-level CrossFire scaling, this HD6990 is gonna be a monster.

This post has been edited by zerorating: Nov 11 2010, 05:58 PM
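The "dual underclocked Cayman within the budget" idea can be sketched with a rough power model. Dynamic power scales with V²·f, and since voltage is usually lowered together with frequency, power falls roughly as f³. The 250 W full-speed chip figure below is my own assumption for illustration, not a confirmed Cayman XT spec:

```python
# Rough model of why underclocking lets two chips fit one card's budget:
# dynamic power ~ V^2 * f, and V is typically lowered along with f, so
# P ~ f^3. The 250 W full-speed chip power is an assumed figure.

def clock_scale_for_budget(p_full_w: float, p_target_w: float) -> float:
    """Fraction of full clock that fits the target power, assuming P ~ f^3."""
    return (p_target_w / p_full_w) ** (1 / 3)

full_chip_w = 250.0          # assumed single Cayman XT board power
dual_card_budget_w = 300.0   # the <300 W target mentioned in the thread
scale = clock_scale_for_budget(full_chip_w, dual_card_budget_w / 2)
print(f"each chip at ~{scale:.0%} of full clock")
```

Under these assumptions each chip would only need to drop to roughly 84% of its full clock to halve its power, which is why a dual card loses much less than half its per-chip performance.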
Nov 12 2010, 10:36 PM
#5
QUOTE(Boldnut @ Nov 12 2010, 09:14 PM) If Cayman XT is 300W, then AMD will not be able to make the HD6990 within a 300W TDP even with two underclocked Caymans. You don't want the HD6990 to become a 500W graphics card. The way I see it, Cayman vs GTX580 is going to be like 5870 vs GTX480. If AMD wants to fit two Cayman chips within a 300W TDP, it just means they cannot make a huge high-power chip alone; that doesn't even fit AMD's "sweet spot" design. Even though Cayman may be more efficient than Fermi, I don't think it can be twice as efficient. Cayman's performance will be about 80-90% of the GTX580. The delay is more likely because AMD took another month to look into additional tweaks for extra performance, so Cayman won't fall too far behind the GTX580 and can repeat the success of the 5870.

I'm just following this image, nothing personal:

[leaked spec slide]

Since Cayman already uses an 8+6-pin connector, it was expected that Antilles would be more than 300W. With the TDP that was mentioned, Cayman XT's performance may be a little higher than the GTX580's.
Nov 12 2010, 11:38 PM
#6
QUOTE(law1777 @ Nov 12 2010, 11:26 PM)

Well, Fuad may just be bluffing, since they assume AMD delayed because of the GTX580's performance. However, if the GPU is only as big as Cypress with much lower power consumption, slower than the GTX580 is acceptable.

This post has been edited by zerorating: Nov 12 2010, 11:42 PM
Nov 14 2010, 05:29 PM
#7
Nov 21 2010, 10:34 PM
#8
guys, another leaked HD6970 spec..

[leaked spec slide]

Only 32 ROPs and 160 GB/s of memory bandwidth. However, the HD6970 is nearly here.

p.s. the slide's spec is similar to the old leaked spec

[old leaked spec slide]

This post has been edited by zerorating: Nov 21 2010, 10:45 PM
Nov 21 2010, 11:20 PM
#9
QUOTE(victor2212 @ Nov 21 2010, 11:14 PM) I think I'll upgrade when the 7 or 8 series comes out.

When? Next year, November/December? Since you already have a 5000-series card, better to skip it and wait for a >300mm² 28nm chip. The first 28nm chip from AMD is going to replace the current Juniper (HD5770).

This post has been edited by zerorating: Nov 21 2010, 11:20 PM
Nov 21 2010, 11:47 PM
#10
Nov 22 2010, 01:06 AM
#11
QUOTE(Bettyem @ Nov 22 2010, 12:34 AM) Want to ask: which ATI graphics card can play Call of Duty: Black Ops at max settings @1920x1080 without any lag?

Depends on your CPU. My dual-core rig only reaches <20% GPU usage, hence low fps in games. Once you have the right CPU, even an HD4850 can handle the game well.

This post has been edited by zerorating: Nov 22 2010, 01:06 AM
Nov 22 2010, 02:18 PM
#12
Nov 22 2010, 02:20 PM
#13
Nov 22 2010, 03:13 PM
#14
hey guys, leaked spec of Antilles (HD6990) — confirmed two Cayman chips:

[leaked spec slide]

6 TFLOPs of shader power at a 300W TDP? Even the GTX580 draws 350W at full load; I wonder what their dual-GPU card will offer.

This post has been edited by zerorating: Nov 22 2010, 03:16 PM
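Taking the leaked 6 TFLOPs at face value together with the 1920-shader Cayman rumour from earlier in the thread, the implied per-chip clock can be checked the same way. Both inputs are rumours, so this is only a consistency check:

```python
# 6 TFLOPs split across two Cayman chips, each with the rumoured 1920
# shaders: clock = FLOPs / (shaders * 2 FLOPs per cycle).

def implied_clock_mhz(tflops_per_chip: float, shaders: int) -> float:
    """Core clock (MHz) implied by a peak FP32 figure."""
    return tflops_per_chip * 1e12 / (shaders * 2) / 1e6

per_chip_tflops = 6.0 / 2
print(implied_clock_mhz(per_chip_tflops, 1920))  # 781.25
```

That lands well below the ~900 MHz the single-card 3.5 TFLOPs rumour implies, which fits the "dual underclocked Cayman" reading of the leak.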
Nov 23 2010, 09:47 AM
#15
QUOTE(tech3910 @ Nov 23 2010, 09:25 AM) HD 6990 spec finalized
http://www.hardware-infos.com/news.php?news=3767 — it's in German, use Google Translate.
There is a chance of it being faster: the GTX 580 is 30% faster than the 6870. A 6870 has 1120 shaders, which is 58.33% of the 6970's total. The extra 41.67% of shaders on the 6970 will most probably translate to at least a 30% performance boost, or even more.

Don't forget the HD6900 series is a completely new design: dual rasterizers, 4D shader architecture, double the geometry throughput compared to Cypress. The HD6800 is more like a refined Cypress on a smaller die. R600 v2, maybe.

This post has been edited by zerorating: Nov 23 2010, 09:52 AM
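The percentages in the quoted post mix two baselines: the extra shaders are ~42% of the 6970's rumoured total, but ~71% more than what the 6870 actually has — which matches the "more than 70% more raw shader power" claim from earlier in the thread. A quick check:

```python
# Comparing the HD6870's 1120 shaders against the rumoured 1920 of the
# HD6970, using two different baselines for the percentage.

hd6870_shaders = 1120
hd6970_shaders = 1920   # rumoured, not confirmed

share_of_6970 = hd6870_shaders / hd6970_shaders       # 6870 as share of 6970
extra_vs_6870 = hd6970_shaders / hd6870_shaders - 1   # uplift over the 6870

print(f"{share_of_6970:.2%}")   # 58.33%
print(f"{extra_vs_6870:.2%}")   # 71.43%
```

So a 30% uplift over the 6870 would actually be a fairly conservative reading of the shader counts, assuming performance scales anywhere near linearly with them.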
Nov 23 2010, 12:07 PM
#16
QUOTE(Demonic Wrath @ Nov 23 2010, 11:54 AM) Uh-oh.. someone's got some explaining to do... Looks like AMD is downgrading the IQ for better performance.
Testing NVIDIA vs AMD Image Quality
Source: AMD Radeon HD6870 test

Optimizations are normal as long as the worse IQ isn't visible. If a user notices it, they can always turn off Catalyst AI (at lower fps), which is impossible to turn off on an Nvidia card. Nvidia was also caught cheating before. Catalyst 10.10e has four Catalyst AI settings for users' taste, compared to two settings (standard and quality) on older drivers.

This post has been edited by zerorating: Nov 23 2010, 12:09 PM
Nov 23 2010, 12:19 PM
#17
QUOTE(Demonic Wrath @ Nov 23 2010, 12:12 PM) Yes, optimization is normal. AMD does offer the option to turn off Catalyst AI, but in benchmarks the testers usually don't change the CP settings. This will result in higher performance in benchmarks, no? If it's okay to be like that, then AMD and NVIDIA should make the default settings the lowest quality so their performance is at its best.

How many fps of difference do you get when selecting different IQ settings? Yup, I'll give Nvidia a thumbs up, since they default to the quality setting when the driver is installed. Some state that ComputerBase and PCGH used the 'high' Catalyst AI setting instead of the default 'quality'. Don't forget Nvidia lied in the 3DMark CPU test by running PhysX on the GPU, hence a higher 3DMark score.

This post has been edited by zerorating: Nov 23 2010, 12:26 PM
Nov 23 2010, 12:40 PM
#18
QUOTE(DarkSilver @ Nov 23 2010, 12:36 PM) That's why most 3DMark Vantage competitions have forbidden users from using an Nvidia GPU to run the PhysX test. Because the score they'd get would be like 2x or 3x more than usual.

For the CPU test, it's supposed to bench the CPU, not the GPU.

This post has been edited by zerorating: Nov 23 2010, 12:41 PM
Nov 23 2010, 02:24 PM
#19
QUOTE(Demonic Wrath @ Nov 23 2010, 12:53 PM) I've tested in benchmarks and got around a 5% improvement going from Quality to High Performance [NVIDIA CP].

Just recently I changed the texture quality (on an 8600M GT) in the Resident Evil fixed benchmark:
quality — 46.1 fps
performance — 46.3 fps
high performance — 46.6 fps
Not really a 5% difference. Are you using the quality/performance settings from "Adjust image settings with preview", and not "Texture filtering - Quality" under "Manage 3D settings" in the Nvidia CP?

Sorry, off topic; mods, please remove this if irrelevant.

This post has been edited by zerorating: Nov 23 2010, 02:30 PM
Nov 23 2010, 02:42 PM
#20
QUOTE(DarkSilver @ Nov 23 2010, 02:39 PM) But I can't notice the graphics difference (by bare eye) between High Performance vs Quality/High Quality on an Nvidia GPU. So setting it to High Performance is better. Similarly for AMD/ATI — there's also no significant graphics difference between High Performance and High Quality. AMD and Nvidia both have different optimizations for every game. Some games favour Nvidia and some AMD; similarly for graphics quality.

Same as me. When playing a game we're looking at moving images, not static images.

This post has been edited by zerorating: Nov 23 2010, 02:42 PM