AMD Radeon™ Discussion V12, Latest - 14.12 | WHQL - 14.12

Post #1 | terradrive (Senior Member, 1,943 posts, joined Apr 2005) | Jan 9 2015, 08:03 PM

QUOTE(xFreedomBoix @ Jan 9 2015, 07:27 PM): hi hehe... wanna ask the owners of the Sapphire R9 290 Tri-X: what are your idle and load temperatures on the card? Ambient temp 33C, no aircon.

Idle ~46C, load ~77C.
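
For anyone who wants to log these numbers rather than eyeball them, here is a minimal sketch. It assumes a Linux box running that era's proprietary fglrx/Catalyst driver, whose `aticonfig --odgt` command prints a line like "Sensor 0: Temperature - 46.00 C"; the regex and the 5-second interval are arbitrary choices for this example (on Windows you would read the same sensor with GPU-Z or MSI Afterburner instead):

```python
# Poll the AMD GPU core temperature via fglrx's aticonfig tool.
import re
import subprocess
import time

def read_gpu_temp_c() -> float:
    out = subprocess.run(
        ["aticonfig", "--odgt"],          # "overdrive get temperature"
        capture_output=True, text=True, check=True,
    ).stdout
    match = re.search(r"Temperature\s*-\s*([\d.]+)\s*C", out)
    if match is None:
        raise RuntimeError("could not parse aticonfig output")
    return float(match.group(1))

if __name__ == "__main__":
    while True:                           # Ctrl-C to stop
        print(f"GPU temp: {read_gpu_temp_c():.1f} C")
        time.sleep(5)
```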

Post #2 | terradrive (Senior Member, 1,943 posts, joined Apr 2005) | Jan 9 2015, 09:52 PM

QUOTE(CwwKiT @ Jan 9 2015, 09:09 PM): Guys, how's this card (HD 6950)? Any issue with it? If there's any problem, do I have to send it back to PowerColor?

I bought an HD5830 from ipmart before; no problems with the card and it's running fine.

Post #3 | terradrive (Senior Member, 1,943 posts, joined Apr 2005) | Jan 15 2015, 10:36 AM

300W is nothing. The most important thing is whether the cooler can handle it.

I OCed my 290 and it can pull over 300W too, with no problem running all day. Even an OCed GTX 980 can pull a lot of watts. In gaming the GTX 980 uses less power, but in CUDA apps even stock third-party cards can get near 300W.

In gaming the GTX 980 only uses around 180W. In CUDA or OpenCL, the power consumption of a third-party card hits 280W! The reference GTX 980 has a lower wattage limit and ran at 180W. Both are still higher than the 165W TDP claimed by Nvidia.
Source: http://www.tomshardware.com/reviews/nvidia...ll,3941-12.html

This is for the reference GTX 980, which has the lower wattage limit. Under Nvidia's own 25% OC (core pushed to 1377MHz, memory to 1466MHz, max boost to 1515MHz at the same voltage), the GTX 980 uses slightly more power than a reference R9 290X. Now, 25% more than 165W should be about 206W, so why does a supposed 206W card use more power than the R9 290X, which is claimed to be a 250W-300W TDP card? And that's the reference card; third-party GTX 980s will chug power just like the R9 290 series once you try to overclock them. Nvidia's TDP figures are questionable, while AMD's TDP figures are exaggerated.
Source: http://www.anandtech.com/show/8526/nvidia-...x-980-review/22

People who think they can run a GTX 980 or GTX 970 on a small PSU had better be careful, especially if they get curious and try to overclock the card. They will also face problems running CUDA apps on a small PSU.

The bottom line is: 300W is nothing.

This post has been edited by terradrive: Jan 15 2015, 10:38 AM
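
As a back-of-envelope check of the figures above (all watt numbers come from the post and its linked reviews; this sketch just redoes the arithmetic):

```python
# Reproduce the post's TDP scaling argument: if power scaled linearly
# with a 25% overclock, a 165 W GTX 980 would land near 206 W,
# yet the quoted measurements sit closer to R9 290X territory.
nominal_tdp_w = 165   # Nvidia's rated GTX 980 TDP (from the post)
oc_factor = 1.25      # the 25% OC scenario discussed above

print(f"Naive linear estimate: {nominal_tdp_w * oc_factor:.0f} W")  # ~206 W

# Measured draws quoted in the post, for contrast with the 165 W rating:
measured_w = {
    "GTX 980, gaming": 180,
    "GTX 980 third-party, CUDA/OpenCL": 280,
}
for scenario, watts in measured_w.items():
    print(f"{scenario}: {watts} W ({watts - nominal_tdp_w:+d} W vs rated TDP)")
```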

Post #5 | terradrive (Senior Member, 1,943 posts, joined Apr 2005) | Jan 16 2015, 10:55 AM

QUOTE(gogo2 @ Jan 16 2015, 10:36 AM): I want to ask. Let's say an R7 240: it's a 50W GPU with no need for a 6-pin PSU connector. I have an A68N-5000 board with an AMD A4-5000; its TDP is also around 30W. Total TDP = 50W + 30W = 80W. Will it work with a 150W PSU running at 78% efficiency? I saw a 4-pin connector on the board, which I assume powers the PCI-E 4x lane. I believe it's quite dangerous, because the TDP above is not the max TDP when both CPU and GPU are 100% loaded. So my question is, would it be safer to run a GPU like the AMD R9 280X off a second PSU's 6-pin power? Please help me choose:
1) A68N-5000 + R7 240 with 150W PSU
2) A68N-5000 + R9 280X with 150W PSU + 2nd 400W PSU
(The R9 280X is just an example; I will use a lower-end GPU of course lol)

OK, that can be done. A quality PSU can put out its rated wattage regardless of which rail; for example, my Seasonic X-850 can put out 850 watts on the 12V rails alone. A 150W PSU at 78% efficiency doesn't mean it can only put out 117W; it means it can deliver 150W to the computer's components while pulling about 192W from the wall socket.

If I had to choose, better to run one PSU only; a good 450-550W PSU is cheap now. Cheap PSUs are rated differently, usually as the combined wattage of the 3.3V, 5V, and 12V rails, so the 12V rails probably deliver a lot less than the sticker wattage. It's also better to get a higher-wattage PSU because PSUs degrade over time. And note that components rated at 30W or 50W are quoting average power consumption; the actual draw rises and falls within milliseconds, and the peaks may exceed that figure and overwhelm the PSU. I faced this with my old Cooler Master Extreme 2 625, which could only supply about 480W of ATX-rated power: after a year of use my computer crashed constantly, even with the GPU idling. I switched to a Seasonic and it's very stable now. The Cooler Master PSU went into an Athlon II X4 635 + HD5830 build and it's running fine there.

This post has been edited by terradrive: Jan 16 2015, 10:55 AM
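
The efficiency point is worth pinning down with numbers (a minimal sketch; the 150W/78% figures are from gogo2's question above):

```python
# A PSU's rating is DC power delivered to the components; efficiency
# only determines how much extra AC power is drawn from the wall.
def wall_draw_w(dc_output_w: float, efficiency: float) -> float:
    """AC power pulled from the socket to deliver a given DC load."""
    return dc_output_w / efficiency

# The example above: a 150 W PSU at 78% efficiency still supplies
# up to 150 W to the PC, while pulling about 192 W from the socket.
print(f"{wall_draw_w(150, 0.78):.0f} W from the wall")  # ~192 W
```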

Post #6 | terradrive (Senior Member, 1,943 posts, joined Apr 2005) | Jan 21 2015, 08:52 AM

QUOTE(Ryeeson @ Jan 21 2015, 07:09 AM): Guys, have you ever experienced your R9 290 limiting its fan speed? Mine only runs at 1070rpm max, which causes the core to downclock from 947MHz to about 300MHz at 94C. I have tried Afterburner, flashing the performance BIOS, switching to Uber mode, enabling OverDrive, etc. This card of mine seems defective, with the fan maxed at 1070rpm; I know the fan can spin up to 2650rpm. When booting I can hear the fan spin at a higher rpm, but when running a Unigine test or playing a game it just doesn't respond to the rising temperature.

Did you try third-party software such as Sapphire Trixx or MSI Afterburner to adjust the fan speed?
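
Tools like Trixx and Afterburner ultimately apply a user-defined fan curve: a set of (temperature, fan%) points interpolated in between. The sketch below only illustrates that mapping; the points are made up, and real fan control happens in the driver, not in a script like this:

```python
# Illustrative fan curve: linear interpolation between set points.
CURVE = [(40, 25), (60, 40), (75, 60), (85, 85), (94, 100)]  # (deg C, fan %)

def fan_percent(temp_c: float) -> float:
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # Interpolate between the two surrounding points.
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]   # clamp above the last point

print(fan_percent(94.0))  # 100.0: full speed at the 94C throttle point above
```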

Post #8 | terradrive (Senior Member, 1,943 posts, joined Apr 2005) | Jan 22 2015, 02:44 PM

QUOTE(faidz85 @ Jan 22 2015, 10:34 AM): I am using this mobo: http://www.gigabyte.my/products/product-pa...spx?pid=4491#ov. I don't remember it quite clearly, but the sound card sits so close to the GPU that I don't think anything thicker than a 2-slot cooler can fit. I will check once I get home.

Some high-performance coolers for the 290/290X use only 2 slots, like the Sapphire Tri-X.

Post #9 | terradrive (Senior Member, 1,943 posts, joined Apr 2005) | Jan 23 2015, 03:00 PM

QUOTE(wongkoewei @ Jan 23 2015, 12:09 PM): Both the MSI and Sapphire websites show a minimum PSU requirement of 750W:
http://www.msi.com/product/vga/R9-290X-LIG...o-specification
http://www.sapphiretech.com/presentation/p...n=&lid=1&leg=0#
Since the R9 300 series is approaching, do you think the 290X is still a viable purchase? I would like to make a final upgrade before our Government's April Fool present: GST. My current system spec:
i5-4670K
16GB DDR3 1600MHz value RAM
Gigabyte Z87X-D3H
Sapphire HD 7870 GHz
1.5TB + 2TB WD Green
Cooler Master V6GT CPU cooler
Cooler Master Silent Pro M2 620W
Corsair Carbide 400R
If I can't wait, should I go for a cheap GTX 970 (lol, I know this is the Radeon subforum) or get a 2nd-hand R9 290?

It's near the end of January already; if it were me I'd wait for the 300 series. Your HD7870 can still play the latest games too.

Post #12 | terradrive (Senior Member, 1,943 posts, joined Apr 2005) | Jan 25 2015, 10:05 AM

I thought the 380X was slated for around June, and still on 28nm?

This post has been edited by terradrive: Jan 25 2015, 10:06 AM

Post #13 | terradrive (Senior Member, 1,943 posts, joined Apr 2005) | Jan 26 2015, 08:57 PM

QUOTE(wongkoewei @ Jan 26 2015, 06:59 PM): Wow... is that yours? Took off just for me... my honour~~~ Not my dream card, but I'm really curious: if I bought a 290X 8GB, how much would the extra VRAM help in gaming, especially if I turn MSAA all the way up~~

I haven't seen any game use more than 4GB of VRAM other than Shadow of Mordor with its high-resolution texture pack. Also, some newer games can't even use MSAA anymore; it's too demanding. You have to use SMAA or FXAA now.

Post #14 | terradrive (Senior Member, 1,943 posts, joined Apr 2005) | Jan 26 2015, 09:42 PM

QUOTE(wongkoewei @ Jan 26 2015, 09:13 PM): COD Advanced Warfare has it, in the form of 'supersampling'; I guess that's a kind of SSAA. Crysis 3 has it, and Watch Dogs has the option too, though I'm not using it now...

SMAA is different from SSAA. For me SMAA is the best AA right now. The performance impact is just a few percent, like FXAA, but it looks really good. I have to say I prefer it over 4x MSAA; MSAA looks sharp, but its performance impact can reach 15-30%. Far Cry 4 with SMAA looked awesome.
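
To put those percentages in frame-rate terms (a rough illustration; the 60fps baseline is arbitrary and the cost figures are the ones quoted in the reply):

```python
# Convert a percentage frame-rate cost into the resulting fps.
base_fps = 60.0
aa_costs = {
    "FXAA / SMAA": 0.03,
    "MSAA 4x, best case": 0.15,
    "MSAA 4x, worst case": 0.30,
}
for aa, cost in aa_costs.items():
    print(f"{aa}: {base_fps * (1 - cost):.1f} fps")
```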

Post #15 | terradrive (Senior Member, 1,943 posts, joined Apr 2005) | Jan 27 2015, 07:09 PM

QUOTE(wongkoewei @ Jan 27 2015, 07:01 PM): Although most reviews tell me that even 4K gaming will not use all 4GB of VRAM, my gut says otherwise; none of the 8GB card reviews here show things like GPU and VRAM usage, and I'm not sure of any scenario where 4GB of VRAM is actually stressed. One question for all the pros here: what exactly will HBM do for a card's actual performance, given that I'm not sure the current-gen 512-bit bandwidth is even 100% utilized? If 512-bit GDDR5 is yet to be fully utilized, will HBM make any difference? It's like the PCIe 2.0 vs 3.0 debate: most results show minimal difference because most cards don't fully utilize even PCIe 2.0, let alone 3.0. Does the same logic apply to memory bandwidth on a graphics card?

Even if future games require big VRAM, 4GB is sufficient; simply lower the texture settings.

A Penang local, eh... Penang is good, it has LGE LOL!!! And most importantly, the FOOD is great!!!
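
On the bandwidth question, the raw arithmetic looks like this (the 290X numbers are its published spec; the HBM line uses the 4096-bit, 1Gbps-per-pin configuration that first-generation HBM later shipped with, since final specs were not public in January 2015):

```python
# Theoretical memory bandwidth = bus width in bytes * data rate per pin.
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(512, 5.0))    # R9 290X, 512-bit GDDR5: 320.0 GB/s
print(bandwidth_gb_s(4096, 1.0))   # first-gen HBM, 4 stacks: 512.0 GB/s
```

Whether games can actually use the extra bandwidth is exactly the open question in the quote; the arithmetic only shows that the ceiling moves, not that frame rates will.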

Post #17 | terradrive (Senior Member, 1,943 posts, joined Apr 2005) | Jan 29 2015, 12:30 AM

GPU usage and VRAM usage don't really correlate with each other.

Post #18 | terradrive (Senior Member, 1,943 posts, joined Apr 2005) | Jan 29 2015, 12:37 AM

QUOTE(wongkoewei @ Jan 29 2015, 12:32 AM): There are so many 4GB cards on the market, I would think that's the target for game developers for years to come.

The ceiling for a game developer's VRAM requirement when designing a game now would be 2GB, 3GB, or 4GB; in rare cases 6GB or 8GB.
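
Some quick arithmetic on why those ceilings are plausible (illustrative numbers only; the texture size and mipmap overhead are textbook figures, not from any particular game):

```python
# Uncompressed RGBA8 memory footprints: render targets are small and
# textures dominate, which is why texture settings drive VRAM use.
def rgba8_mib(width: int, height: int) -> float:
    return width * height * 4 / 2**20

print(f"1080p render target: {rgba8_mib(1920, 1080):.1f} MiB")   # ~7.9 MiB
print(f"4K render target:    {rgba8_mib(3840, 2160):.1f} MiB")   # ~31.6 MiB
# A 4096x4096 RGBA8 texture plus mipmaps (~1/3 extra) is ~85 MiB,
# so a few dozen uncompressed ones already approach a 2-4 GB budget.
print(f"4K texture + mips:   {rgba8_mib(4096, 4096) * 4 / 3:.0f} MiB")
```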

Post #19 | terradrive (Senior Member, 1,943 posts, joined Apr 2005) | Jan 29 2015, 10:21 AM

When the competitor shot themselves in the foot, AMD didn't just stay silent but helped them shoot again.

Post #20 | terradrive (Senior Member, 1,943 posts, joined Apr 2005) | Jan 29 2015, 10:56 AM

QUOTE(chocobo7779 @ Jan 29 2015, 10:53 AM): AMD's Robert Hallock: https://twitter.com/thracks

I saw the link in a comment someone posted at PCPer.

Topic Closed