Radeon HD 3870 and HD 3850 On The Horizon, The Dark Side is POWERFUL . come join
Nov 26 2007, 01:02 AM
#1
Senior Member
1,313 posts, Joined: Jan 2006
This card is being priced at around $180, compared to $250 for an 8800 GT (street prices, comparing at online retailers). I'd say it's a good deal for those looking for performance.

But I'll wait for my 8800 GT.
Nov 26 2007, 02:13 AM
#2
I have an AMD X2 3800 at 2 GHz with 2 GB of RAM. If I upgrade my card to a Radeon 3870 or a GeForce 8800 GT, do you think my CPU can keep up with Crysis, or even recently released games like COD4 or BioShock?
Nov 26 2007, 04:17 PM
#3
Nov 29 2007, 02:47 AM
#4
QUOTE(empire23 @ Nov 28 2007, 01:28 AM)
Next Geforce high end part will not be G92 right now. It will use the G80 IIRC due to the need for high clockspeeds and the supply and clocking issues with the G92.

Can't be true. Why would they use a more expensive process (90nm) to build a card that would undoubtedly require a larger number of transistors? Not to mention the heat from the larger die and its presumably lower clocking headroom. I'm not familiar with the G92's clocking issues, but it is certainly clocking higher than the G80. I find your logic backwards.

QUOTE(ikanayam @ Nov 28 2007, 02:55 AM)
It will be a non-cutdown G92 for the faster card coming soon. G80 has been EOL'd, they're just selling whatever old stocks they have.

What would a full version of the G92 be? An increased number of SPs and higher memory bandwidth? Anyone who has bought a G80 GTS or GTX now or in the past two weeks deserves to be laughed at.
Nov 29 2007, 06:36 AM
#5
QUOTE(empire23 @ Nov 28 2007, 04:25 PM)
Hear it's internally called the G80 plus.

I don't recall any latest-gen high-end graphics card using the previous generation's chip or process. The GeForce 7800 GTX used 110nm and the GeForce 7900 GTX used 90nm; just look through the R520 and R600 series and you'll see the same pattern. Also, isn't the Ultra, the pinnacle of NVIDIA's offerings at that time, also on the 90nm process?

QUOTE(empire23 @ Nov 28 2007, 04:25 PM)
The idea isn't the larger amount of transistors or whatnot. Highend parts have only 2 things in mind, yield, and clockspeed, and if the old process is unable to deliver on both, graphic card makers usually aim for a combinational process. the G92 made in 90nm is a possibility, smaller processes don't really mean you'll get better clocks or usable chips per wafer. That's why the latest high enders of next gen cards usually use last gen processes because they're proven and easily optimized for.

If they could have made the G80 go beyond 612 MHz without producing too much heat, they would have. Maybe they are working on it, hence the G80 plus. But currently the G92 is churning out less heat on its 65nm process, packing more transistors, and boasting better clocking performance, with the GTS coming out at 650 MHz on a reference chip. Now that they are capable of producing the G92 on a 65nm process, it would be insane to go back to using 90nm. If you're talking about optimizing a G80+ on 65nm, that sounds plausible. But staying at 90nm?
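The yield-versus-die-size argument above can be made concrete with a back-of-the-envelope sketch. This is a rough illustrative model with assumed numbers, not actual NVIDIA data: the die areas (~480 mm² for a G80-class chip, ~330 mm² for a G92-class chip) and the defect density D0 are assumptions, using the classic dies-per-wafer approximation and a simple Poisson yield model.

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Classic dies-per-wafer approximation: gross dies minus edge loss."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_poisson(die_area_mm2, defects_per_cm2):
    """Poisson yield model: fraction of dies with zero defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

# Hypothetical die sizes on a 300 mm wafer, assumed defect density 0.5/cm^2
for name, area in [("G80-class (90nm)", 480), ("G92-class (65nm)", 330)]:
    gross = dies_per_wafer(300, area)
    good = gross * yield_poisson(area, 0.5)
    print(f"{name}: {gross} gross dies, ~{good:.0f} good at D0=0.5/cm^2")
```

The smaller die wins on both gross dies and yield fraction here, but note that D0 is the wild card: a new, immature process typically starts with a higher defect density, which is exactly empire23's point about proven last-gen processes.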
Topic Closed