AMD Radeon™ Discussion V12, Latest - 14.12 | WHQL - 14.12
#1 | Feb 26 2015, 08:49 PM
Senior Member | 5,211 posts | Joined: Feb 2005 | From: Konohana
QUOTE(Unseen83 @ Feb 26 2015, 02:48 PM)
http://wccftech.com/dx12-nvidia-amd-asynchronous-multigpu/
#3 | Mar 5 2015, 12:03 AM
QUOTE(Acid_RuleZz @ Mar 4 2015, 10:22 PM)
Just check those online benches for the MSI R9 290X Gaming. Max temp on my first stress test is 92°C. Couldn't wait anymore bruh. Can't sleep soundly, can't even shower in peace. Lel.

The R9 290X Gaming is fitted with the old Twin Frozr IV HSF; it was never really meant to handle the heat an R9 290X produces, but at least it's quiet. I'm not really a fan of the small fans, pun intended, and the cooling performance isn't that great IIRC. I like the aesthetics though.
#4 | Mar 22 2015, 07:47 PM
QUOTE(Unseen83 @ Mar 22 2015, 07:14 PM)

From what I know, screen tearing can't be captured in a screenshot, and that doesn't look like tearing either. If any of you are feeling brave, you can overclock your refresh rate with software like CRU (Custom Resolution Utility). I got mine to 75 Hz; that eliminated screen tearing for me and also helps with fast-paced gaming.
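Rough arithmetic on why the bump is noticeable, if anyone is curious; nothing card-specific, just the refresh interval:

CODE
# Frame interval per refresh rate: at 75 Hz the screen redraws roughly every
# 13.3 ms instead of 16.7 ms at 60 Hz, so a new frame can show up a bit sooner.
for hz in (60, 75):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per refresh")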
#5 | Mar 30 2015, 08:12 PM
QUOTE(Unseen83 @ Mar 30 2015, 07:10 PM)
add-on: "Quiet Mode" – BIOS position one. The switch is in the position closest to where you plug in your displays. This mode is designed to suit a gamer who wants to keep a tight lid on acoustics. If you do not play with headphones, do not have a high-end gaming chassis, or your room's ambient noise level is extremely low, this may be the mode for you. "Uber Mode" – BIOS position two. The switch is in the position furthest away from where you plug in your displays.

That is only applicable to the R9 290X and NOT the R9 290. You can read the review here. Like terradrive said, the switch is merely for selecting either the legacy BIOS or the UEFI BIOS.
#8 | Apr 13 2015, 10:10 PM
QUOTE(terradrive @ Apr 13 2015, 09:03 PM)
Then isn't it easier to remember that: total power consumption = heat generated + actual work done by the component. Then the power consumption can be higher. Bear in mind that there are hardly any reference 280X cards; most of them are slightly overclocked, like this Gigabyte card, and the 280 is a higher-clocked version of the 7950 Boost. Better safe than sorry; some games' UIs are crazy, like AC Unity's in-game menu screen, where the GPU ramps up to crazy heat and power consumption if you don't turn on Vsync.

Your equation is wrong. Power consumption depends on load: P = IV, so the higher the load, the more current the component draws and hence the higher the power usage, even if the voltage stays the same; at low load it draws less current and hence less power. Different games put different loads on the GPU. The maximum figure you quoted comes from Furmark, a synthetic power virus, which is irrelevant. AC Unity's menu is a bad example, as it's just that one game having that problem. His problem was most likely the vBIOS, since he already fixed it after people recommended he update it; the flickering issue on 280 cards is well known all over the internet.
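To put rough numbers on the load point, here's a quick Python sketch; the amp figures are made up purely for illustration, not measurements of any particular card:

CODE
# P = I * V: at a fixed 12 V rail, power tracks the current the load pulls.
RAIL_VOLTAGE = 12.0  # volts

def power_draw(current_amps, voltage=RAIL_VOLTAGE):
    """Electrical power in watts."""
    return current_amps * voltage

# Hypothetical current draws for different loads (illustration only).
loads = {"idle desktop": 3.0, "light game": 10.0, "heavy game": 18.0, "Furmark power virus": 22.0}
for name, amps in loads.items():
    print(f"{name:>20}: {power_draw(amps):5.1f} W")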
#9 | Apr 13 2015, 10:59 PM
QUOTE(terradrive @ Apr 13 2015, 10:42 PM)
Dude, my equation is totally irrelevant to yours. It is a totally different thing. The equation I posted is really basic in engineering terms; P = IV is describing an entirely different matter. Furmark is the maximum power consumption the card can ever reach, but one rare example, the AC Unity menu bug, can cause an abrupt increase in power consumption. Such a thing is very hazardous to low-end power supplies, some of which can even burn out because they lack proper overpower/overload protection. Also notice the peak power is at 244 W, and that's achieved in Crysis 3 IINM. I saw the posts about the flickering and I'm glad he fixed it, but the PSU requirement recommended by AMD for the 280X is still 750 watts, which is a far cry from his current 500 watts.

It's not an entirely different matter, it's basic electronics; electrical engineering still deals with currents and voltages. The main point is that TDP, which means Thermal Design Power, is NOT equal to power consumed. TDP also varies from one company to another: AMD rates its CPU TDP differently from Intel, and the same goes for Nvidia and AMD GPUs. Nobody "plays" Furmark; as I said, it's irrelevant because some GPUs automatically detect the program and apply a limiter to prevent the card from overheating or drawing too much power. The peak power figure is from Crysis 2, which is clearly shown at the top of the page. You are delusional. Show me where AMD states that a SINGLE R9 280X requires 750 W. Read the Guru3D review of the R9 280X Twin Frozr's power consumption; they state the recommended PSU right there.
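And on how a PSU recommendation gets derived, the usual back-of-envelope is something like this (my own assumed figures for the rest of the system and the headroom, not AMD's or Guru3D's numbers):

CODE
# Rule-of-thumb PSU sizing: card peak draw + rest of system, plus headroom.
card_peak_w = 244        # peak card draw under a game load, as quoted above
rest_of_system_w = 150   # CPU, board, RAM, drives, fans (assumed typical gaming rig)
headroom = 1.3           # ~30% margin so the PSU isn't running near its limit

recommended_psu_w = (card_peak_w + rest_of_system_w) * headroom
print(f"Suggested PSU capacity: ~{recommended_psu_w:.0f} W")  # about 512 W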
#10 | Apr 14 2015, 11:16 AM
QUOTE(terradrive @ Apr 14 2015, 09:15 AM)
Who is the one that is delusional here? First, Assassin's Creed Unity has the menu bug which draws big power from the card; why did you conveniently drop this from your argument? Answer this.

How long do you actually stay on the menu? Can you see why I call you delusional over one measly game with its menu bug?

QUOTE(terradrive @ Apr 14 2015, 09:15 AM)
Second, the peak power shown is 239 watts from your link and 244 watts from my link. It is near the 250 watts claimed by AMD. That is not a small power figure. Oh hey, how convenient that the page you linked stated this: [image] I guess there's no need to play safe, just let the PSU burn out and take some components with it, all right and dandy.

I agree, but did he mention any problem from that list? He even says there is a problem with Chrome, which is not game related.

QUOTE(terradrive @ Apr 14 2015, 09:15 AM)
Please, games aren't the only things draining huge power from a GPU. If you are so sure the card will automatically throttle because it was designed that way, why don't you help those manufacturers write their BIOS? The GTX 970 and 980 were touted as power-sipping GPUs; look at how much power the aftermarket cards use when running CUDA. Same thing with Radeons being able to run OpenGL.

Wha? How did it come to this? That's OpenCL, not OpenGL. We are discussing games, not GPGPU performance or its power consumption.

QUOTE(terradrive @ Apr 14 2015, 09:15 AM)
Third, cherry-picked MSI card with its stated lower PSU requirement? Please. Wow, this is interesting: Asus states power consumption of up to 300 watts. I'm conveniently leaving out the Gigabyte 280X, which states a 600-watt PSU requirement, to follow your style of cherry-picking.

And now you start picking apart every single 280X card for no obvious reason. The guy with the problem himself says his card is fine now. I'm still waiting for an official AMD statement that a single R9 280X requires a 750 W PSU, like you claimed before.

QUOTE(terradrive @ Apr 14 2015, 09:15 AM)
Fourth, in CPUs people don't use P = IV. People use a more specific formula, P = CV²f. If you didn't know that, then stop talking about currents and voltages.

I don't want to go into the specifics of the equation; I just put down the most basic formula.

QUOTE(terradrive @ Apr 14 2015, 09:15 AM)
Look at his power supply, the Gigabyte PowerRock 500W specifications: two 12 V rails with 18 amps each, and a total of 432 watts delivered across the 12 V rails. So technically his power supply isn't 500 watts. Not to mention an 18 amp rail can deliver a theoretical 216 watts. That is already too close for comfort. For high-end cards with a TDP of 250 watts it is usually recommended to have at least 24 amps on the 12 V rails. Heck, a good quality Seasonic G 450 watt has 37 amps on its 12 V rail.

Agreed that his PSU is crappy, but if he says it's all good, why don't you tell him?
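For reference, the 12 V rail arithmetic he's quoting works out like this (just W = A x V; the 24 A figure is the common rule of thumb, not an official spec):

CODE
# Gigabyte PowerRock 500W label figures quoted above: two 12 V rails at 18 A each.
def rail_watts(amps, volts=12.0):
    return amps * volts

per_rail_w = rail_watts(18)       # 216 W theoretical per rail
combined_12v_w = 432              # combined 12 V capacity stated on the label

card_tdp_w = 250
amps_needed = card_tdp_w / 12.0   # ~20.8 A if the card hung off a single rail
print(per_rail_w, combined_12v_w, round(amps_needed, 1))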
#11 | Apr 17 2015, 09:26 PM
QUOTE(hardcorie @ Apr 17 2015, 03:09 PM)
I have bought a Lenovo Z50 with the AMD FX-7500 and R7 graphics. I know that the Radeon APU graphics shares system memory, and I want to know whether the other GPU, the Radeon R7 260M DX, has its own memory or also shares system memory.

You are right, the FX-7500 is an APU with integrated graphics, and the R7 260M DX has its own 2 GB of dedicated memory. You can double-check with GPU-Z.
#12 | Apr 18 2015, 10:54 AM
QUOTE(Acid_RuleZz @ Apr 18 2015, 09:07 AM)
Oh, my bad. I forgot it's a Kaveri. When I see FX I instantly think it doesn't have an integrated GPU.

AMD brought the FX naming to mobile, but it's just a rebrand of their Kaveri platform and people get confused by it. IMO they shouldn't have done it; FX has been a bad omen for AMD ever since they started using the name.
#13 | Apr 20 2015, 07:58 AM
QUOTE(Unseen83 @ Apr 19 2015, 05:01 PM)
if R9 390 out, good price or RM600-900 lower than competitor with new HBM memory tech [url=http://wccftech.com/amd-talk-graphics-products-quarte

Nah, it won't be that much of a price difference. New cards plus new memory won't be cheap, plus GST. FreeSync is still in its infancy; wait a few months for it to mature. I agree that a higher refresh rate isn't that important, because one of the reasons for it is to achieve fluidity at lower framerates at high resolutions such as 4K and above, other than eliminating screen tearing of course.
#14 | Apr 20 2015, 11:36 AM
QUOTE(S4PH @ Apr 20 2015, 11:12 AM)
Agreed, a new GPU will cost a bomb with GST, plus DX12 will improve older hardware and finally let FX CPUs use up to 6 cores. Might just wait and see before changing to newer GPU technology, but I still game at 1080p, no moolah for a 4K panel.

I'm pretty much going to stick with 1080p for a long time; I don't find the extra pixels appealing. The good news is that because high-end cards are now focused on 4K, those of us at full HD can opt for the mid-range cards that now cater to that resolution. As an added bonus, buying a high-end card for this resolution means we can crank up all the eye candy in game, plus VSR or DSR if needed.
#15 | Apr 20 2015, 12:04 PM
QUOTE(Unseen83 @ Apr 20 2015, 11:43 AM)
Hm... sellers in Malaysia (NOT all of them, but a lot of them) jack up the price. E.g. look at the GTX Titan X at RM4,xxx+, and this was before GST was implemented... it's on the Lowyat news update: http://www.lowyat.net/2015/03/nvidia-gefor...ore-than-rm-4k/ Even if they give the usual excuse of "US currency exchange fluctuation" (even though distros get their goods from China and all the products are produced in China, which feels suspicious/greedy), it shouldn't be more than RM3.9K. And Nvidia recommending USD999 does not mean sellers don't get their % cut when selling at the recommended base price... And for your info, AMD always presents a more competitive price than its competitor... e.g. the GTX 980 and R9 290 differ by only 6-9 FPS, or in some games there's no difference... but in price there's a USD100-250 difference...

That is the reason I refuse to buy anything brand new after GST. I noticed this stupid trend where even before GST they jacked up prices like you mentioned. Not all shops do this, but it's annoying when my favourite shop did it. The reason the R9 290 series is cheap is that it's old. If you open GPU-Z you can see the announcement date; the card is approaching two years old (the 290X was released in Oct 2013!). AMD cut its prices after the GTX 900 series came out around Sept 2014.
#17 | Apr 20 2015, 12:28 PM
QUOTE(sai86 @ Apr 20 2015, 12:22 PM)
but the performance ain't old. love my Tri-X 290 now. the only thing to do now is to find some rubber and fix the goddamn shroud noise

Yup, because AMD built the card with an enormous 512-bit memory bus and 64 ROPs, the performance is still awesome; in reviews the 290 cards handle high resolutions with AA better than the Nvidia cards. Shroud noise? What's the problem with your card?
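The wide bus is also why the bandwidth numbers look the way they do. Quick sketch using the usual reference memory clocks (assumed here, double-check your own card in GPU-Z):

CODE
# Memory bandwidth = (bus width in bytes) * effective transfer rate.
def bandwidth_gb_s(bus_bits, effective_gbps):
    return (bus_bits / 8) * effective_gbps

print("R9 290 (512-bit, 5.0 Gbps) :", bandwidth_gb_s(512, 5.0), "GB/s")  # 320 GB/s
print("GTX 970 (256-bit, 7.0 Gbps):", bandwidth_gb_s(256, 7.0), "GB/s")  # 224 GB/s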
#18 | Apr 27 2015, 07:24 AM
QUOTE(Unseen83 @ Apr 26 2015, 11:53 AM)
agree

Vulkan is not Mantle; it is the next-generation OpenGL, with goals similar to those of Mantle and DirectX 12.