 AMD Radeon™ Discussion V12, Latest - 14.12 | WHQL - 14.12

Najmods
post Feb 26 2015, 08:49 PM


QUOTE(Unseen83 @ Feb 26 2015, 02:48 PM)
So there was a time when AMD fans and Nvidia fans, as end users, just went for the best bang for the buck... soon there will be a new fan category: cross-vendor SLI fans.

http://wccftech.com/dx12-nvidia-amd-asynchronous-multigpu/
*
Yeah, sure, wccftech is the best source for graphics technology. DirectX 12 won't change a thing here. One good example: Nvidia actively disables PhysX when it detects an AMD GPU, or when the Nvidia card isn't the main active card. It actually worked at one point, until Nvidia found out and killed it. And that's just the PhysX part; a cross-brand multi-GPU setup? Not in a billion years.
Najmods
post Mar 3 2015, 04:34 PM


QUOTE(faizal87 @ Mar 3 2015, 12:51 PM)
Why are so many people selling their R9 280? Got a problem?
*
Why post it three times? Got a problem?

Most probably they're just waiting for the new AMD cards to arrive before the value of their card crashes even more.
Najmods
post Mar 5 2015, 12:03 AM


QUOTE(Acid_RuleZz @ Mar 4 2015, 10:22 PM)
Just check the online benches for the MSI R9 290X Gaming: http://www.3dmark.com/fs/4217000
Max temp on my first stress test was 92°C.

Couldn't wait any more, bruh. Couldn't sleep, couldn't sit still. Lol.

I'm not really a fan of the small fans (pun intended), and the cooling performance isn't that great IIRC. I like the aesthetics, though.
*
The R9 290X Gaming is fitted with the old Twin Frozr IV HSF; it was never really meant to handle the heat produced by the R9 290X, but at least it's quiet.
Najmods
post Mar 22 2015, 07:47 PM


QUOTE(Unseen83 @ Mar 22 2015, 07:14 PM)
Eh, is that tearing, or a joke?
*
From what I know, screen tearing can't be captured in a screenshot. And it doesn't look like tearing either.

If any of you are feeling brave, you can overclock your refresh rate using software like CRU (Custom Resolution Utility). I got mine to 75 Hz; that eliminated screen tearing for me and also made fast-paced gaming feel smoother.
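
This isn't CRU itself, just a rough sanity check through the standard Windows display API via pywin32 (assumes you have Python with pywin32 installed); CDS_TEST only asks the driver whether it would accept the mode, nothing is actually applied:

```python
# Ask the display driver whether it would accept 75 Hz at the current
# resolution, without changing anything (CDS_TEST is test-only).
import win32api
import win32con

TARGET_HZ = 75  # the refresh rate you want to try

# Current display mode of the primary monitor.
devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
devmode.DisplayFrequency = TARGET_HZ
devmode.Fields = (win32con.DM_PELSWIDTH
                  | win32con.DM_PELSHEIGHT
                  | win32con.DM_DISPLAYFREQUENCY)

result = win32api.ChangeDisplaySettings(devmode, win32con.CDS_TEST)
if result == win32con.DISP_CHANGE_SUCCESSFUL:
    print(f"{TARGET_HZ} Hz looks acceptable at the current resolution.")
else:
    print(f"Driver rejected {TARGET_HZ} Hz (code {result}); you'd need a custom mode via CRU.")
```

If the driver refuses it here, CRU can still force a custom mode, but that's on you and your panel.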
Najmods
post Mar 30 2015, 08:12 PM


QUOTE(Unseen83 @ Mar 30 2015, 07:10 PM)
add-on: 
    “Quiet Mode” – BIOS position one. The switch is in the position closest to where you plug in your displays. This mode is designed to suit a gamer who wants to keep a tight lid on acoustics. If you do not play with headphones, do not have a high-end gaming chassis, or your room’s ambient noise level is extremely low, this may be the mode for you.
    “Uber Mode” – BIOS position two. The switch is in the position furthest away from where you plug in your displays.
*
That only applies to the R9 290X, NOT the R9 290. You can read the review here. Like terradrive said, on the 290 the switch is merely for choosing between the legacy BIOS and the UEFI BIOS.
Najmods
post Mar 30 2015, 10:51 PM


QUOTE(Unseen83 @ Mar 30 2015, 10:15 PM)
Oh, my bad... sorry man, I'm just a noob at this GPU/OC stuff... so I guess the 290's switch is for fan speed/noise... eh, but on testing stability with gaming, do you agree with that?
*
Yup, 100% agree. I hate those synthetic loads, especially FurMark. It puts a crazy amount of strain on the graphics card; no game loads it that heavily. I can't even bring myself to run FurMark for more than a few seconds, so it's bizarre that people run an hour-long load on their card. What are they trying to achieve? Just play some games and that will be sufficient; if there's any issue, it should show up in-game.
Najmods
post Apr 13 2015, 06:57 PM


QUOTE(terradrive @ Apr 13 2015, 09:40 AM)
Based on TDP:
...
*
TDP stands for Thermal Design Power; it's the amount of heat the component is designed to put out, NOT its power consumption.
Najmods
post Apr 13 2015, 10:10 PM


QUOTE(terradrive @ Apr 13 2015, 09:03 PM)
Then isn't it easier to remember that:

Total power consumption = heat generated + actual work done by the component

Then the power consumption can be higher.

Bear in mind that there are hardly any reference 280X cards; most of them are slightly overclocked, like this Gigabyte card.
The 280 is a higher-clocked version of the 7950 Boost.

Better safe than sorry; some games' UIs are crazy, like AC Unity's in-game menu screen, where the GPU ramps up to crazy heat and power consumption if Vsync isn't turned on.
*
Your equation is wrong. Power consumption depends on load: P = IV, so the higher the load, the more current the component draws and the higher the power usage, even if the voltage stays the same; at low load it draws less current and therefore less power. Different games put different loads on the GPU. The "Maximum" graph you posted uses the synthetic power-virus program FurMark, which is irrelevant.
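
To put rough numbers on what "depends on load" means (my own illustrative figures, not measurements from any review):

```latex
P = IV, \qquad
P_{\text{light load}} = 12\,\mathrm{V} \times 8\,\mathrm{A} = 96\,\mathrm{W}, \qquad
P_{\text{heavy load}} = 12\,\mathrm{V} \times 20\,\mathrm{A} = 240\,\mathrm{W}
```

Same 12 V supply, different current drawn by the load, very different power.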

The AC Unity menu is a bad example, as it's just that one game having that problem.

His problem was most likely the vBIOS, since he fixed it after people recommended updating it. The flickering issue on 280 cards is well known all over the internet.
Najmods
post Apr 13 2015, 10:59 PM


QUOTE(terradrive @ Apr 13 2015, 10:42 PM)
Dude, my equation is totally irrelevant to yours. It's a totally different thing.

The equation I posted is really basic in engineering terms. P = IV is describing an entirely different matter.

FurMark is the maximum power consumption the card can ever reach, but the one rare example of the AC Unity menu bug can cause an abrupt increase in power consumption. Such a thing is very hazardous to low-end power supplies, some of which can even burn out for lack of proper over-power/overload protection.

Also notice the peak power is at 244 W; that's achieved in Crysis 3, IINM.

I saw the posts about the flickering and I'm glad he fixed it. But AMD's recommended PSU requirement for the 280X is still 750 watts, which is a far cry from his current 500 watts.
*
It's not an entirely different matter; it's basic electronics. Electrical engineering still deals with currents and voltages. The main question was TDP, which still means Thermal Design Power and is NOT equal to power consumed. Also, TDP varies from one company to another: AMD rates its CPU TDPs differently from Intel, and the same goes for Nvidia versus AMD GPUs.

Nobody 'plays' FurMark. As I said, it's irrelevant because some GPUs automatically detect the program and apply a limiter to prevent the card from overheating or drawing too much power. The peak power figure is from Crysis 2, which is clearly shown at the top of the page.

You're delusional. Show me where AMD states that a SINGLE R9 280X requires 750 W. Read Guru3D's review of the R9 280X Twin Frozr power consumption; they state the recommended PSU there.
Najmods
post Apr 14 2015, 11:16 AM


QUOTE(terradrive @ Apr 14 2015, 09:15 AM)
Who is the one that is delusional here?

First, Assassin's Creed Unity has the menu bug which draws big power from the card; why did you conveniently drop this from your argument? Answer this.
How long do you actually stay on the menu? Can you see why I call you delusional over one measly game with its menu bug?

QUOTE(terradrive @ Apr 14 2015, 09:15 AM)
Second, the peak power shown is 239 watts from your link and 244 watts from my link. That is near the 250 watts claimed by AMD, which is not a small figure.

Oh hey, how convenient that the page you linked stated this:
user posted image
I guess there's no need to play safe; just let the PSU burn out and take some components with it, all right and dandy.

I agree, but did he mention any problem from that list? He even said the problem showed up in Chrome, which isn't game related.

QUOTE(terradrive @ Apr 14 2015, 09:15 AM)
Please, games aren't the only things draining huge power from a GPU.

If you are so sure the card will automatically throttle because it was designed that way, why don't you go help those manufacturers write their BIOS?
The GTX 970 & 980 were touted as power-sipping GPUs; look at how much power the aftermarket cards use when running CUDA. Same thing with Radeons running OpenGL.
What? How did it come to this? And it's OpenCL, not OpenGL. We're discussing games, not GPGPU performance or its power consumption.

QUOTE(terradrive @ Apr 14 2015, 09:15 AM)
Third, a cherry-picked MSI card with a stated lower PSU requirement? Please.

Wow, this is interesting: Asus states power consumption of up to 300 watts.

I'm conveniently leaving out the Gigabyte 280X, which states a 600 watt PSU requirement, just to follow your style of cherry picking.
And now you're picking through every single 280X card for no obvious reason. The guy with the problem himself says his card is fine now. I'm still waiting for an official AMD statement saying a single R9 280X requires a 750 W PSU, like you claimed earlier.

QUOTE(terradrive @ Apr 14 2015, 09:15 AM)
Fourth, for CPUs people don't use P = IV; they use a more specific formula, P = CV²f. If you didn't know that, then stop talking about currents and voltages.
I don't want to go into the specifics of the equations; I just gave the most basic formula.
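
Just so we're arguing about the same thing, here is how I'd put the two formulas side by side (standard textbook forms, my own summary, not taken from either of our links): P = IV is the electrical power drawn from the supply, while the CMOS dynamic-power formula models how the switching activity inside the chip consumes it.

```latex
P_{\text{electrical}} = I \cdot V_{\text{supply}}
\qquad\qquad
P_{\text{dynamic}} \approx C_{\text{switched}} \cdot V_{\text{core}}^{2} \cdot f
```

Both come out in watts; one is what the PSU has to deliver, the other is a model of where most of it ends up (as heat).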

QUOTE(terradrive @ Apr 14 2015, 09:15 AM)
Look at his power supply, the Gigabyte PowerRock 500 W, and its specifications:
Two 12 V rails with 18 amps each, for a combined 432 watts across the 12 volt rails. So technically his power supply isn't really 500 watts. Not to mention a single 18 amp rail can deliver a theoretical 216 watts, which is already too close for comfort. For high-end cards with a 250 watt TDP it is usually recommended to have at least 24 amps on the 12 volt rail. Heck, a good quality Seasonic G 450 watt has 37 amps on its 12 volt rail.
*

Agreed that his PSU is crappy, but he says it's all good, so why don't you tell him? The quick rail math is below.
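
If anyone wants to redo terradrive's 12 V headroom sums for their own build, here's a tiny sketch; the GPU and CPU wattages are assumptions for illustration only, swap in your own card's and CPU's figures:

```python
# Rough 12 V rail headroom check, following the numbers quoted above.
RAIL_AMPS = [18, 18]   # Gigabyte PowerRock 500 W: two 12 V rails, 18 A each (per the quote)
RAIL_VOLTS = 12.0

GPU_PEAK_W = 250       # assumed peak draw for a 250 W-class card
CPU_PEAK_W = 100       # assumed peak draw for a mid-range CPU

combined_12v_w = sum(amps * RAIL_VOLTS for amps in RAIL_AMPS)  # 432 W combined
single_rail_w = max(RAIL_AMPS) * RAIL_VOLTS                    # 216 W on one rail

print(f"Combined 12 V capacity: {combined_12v_w:.0f} W")
print(f"Single rail capacity:   {single_rail_w:.0f} W")
print(f"Estimated 12 V load:    {GPU_PEAK_W + CPU_PEAK_W} W")
if GPU_PEAK_W > 0.8 * single_rail_w:
    print("The GPU alone is close to (or over) one rail's limit -- too close for comfort.")
```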

Najmods
post Apr 17 2015, 09:26 PM


QUOTE(hardcorie @ Apr 17 2015, 03:09 PM)
I bought a Lenovo Z50 with the AMD FX-7500 and R7 graphics. I know the APU's Radeon shares system memory; what I want to know is whether the other GPU, the Radeon R7 260M DX, has its own memory or also shares system memory.
*
You're right: the FX-7500 is an APU with integrated graphics, and the R7 260MDX has its own 2 GB of dedicated memory. You can double-check with GPU-Z.
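
If you don't feel like installing GPU-Z, here is a rough programmatic alternative using WMI (assumes Windows plus the third-party "wmi" package, pip install wmi); it lists every video controller and the adapter RAM Windows reports for it. For the integrated Radeon the figure is just reserved/shared memory, and WMI caps the value at 4 GB, so treat it as a sanity check only:

```python
# List video controllers and the memory Windows reports for each one.
import wmi

for gpu in wmi.WMI().Win32_VideoController():
    ram_bytes = gpu.AdapterRAM or 0  # can be None for some adapters
    print(f"{gpu.Name}: {ram_bytes / (1024 ** 2):.0f} MB reported")
```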
Najmods
post Apr 18 2015, 10:54 AM


QUOTE(Acid_RuleZz @ Apr 18 2015, 09:07 AM)
Oh, my bad. I forgot it's a Kaveri. When I see FX I instantly assume it doesn't have an integrated GPU.
*

AMD brought the FX naming to mobile, but it's just a rebrand of their Kaveri platform and people get confused by it. IMO they shouldn't have done it; FX has been a bad omen for AMD ever since they started using it.
Najmods
post Apr 20 2015, 07:58 AM


QUOTE(Unseen83 @ Apr 19 2015, 05:01 PM)
If the R9 390 comes out at a good price, say RM600-900 lower than the competition, with the new HBM memory tech, I'm sure CrossFire R9 390 could average 50-60 fps (or higher) at 4K.
[url=http://wccftech.com/amd-talk-graphics-products-quarte
*

Nah, it won't be that big a price difference. New cards plus new memory won't be cheap, plus GST.

FreeSync is still in its infancy; wait a few months for it to mature. I agree that a higher refresh rate isn't that important, because one of the reasons for it is to achieve fluidity at lower framerates at high resolutions such as 4K and above, besides eliminating screen tearing of course.
Najmods
post Apr 20 2015, 11:36 AM


QUOTE(S4PH @ Apr 20 2015, 11:12 AM)
Agreed, new GPUs will cost a bomb with GST. Plus DX12 will improve older hardware, and only then will FX CPUs get to use all 6 cores, so I might just wait and see before changing to newer GPU technology. I still game at 1080p anyway; no moolah for a 4K panel.
*

I'm pretty much going to stick with 1080p for a long time; I don't find the extra pixels appealing. The good news is that because high-end cards are now focused on 4K, those of us at full HD can opt for mid-range cards that now cater to that resolution. As an added bonus, buying a high-end card for 1080p means we can crank up all the eye candy in game, plus VSR or DSR if needed.
Najmods
post Apr 20 2015, 12:04 PM


QUOTE(Unseen83 @ Apr 20 2015, 11:43 AM)
Hmm... sellers in Malaysia (NOT all of them), a lot of them jack up the price. E.g. look at the GTX Titan X at RM4,xxx+, and that was before GST was implemented; it's in the Lowyat news update. Even if they give the usual "US currency exchange fluctuation" excuse (even though distros get their goods from China and all the products are made in China, which feels suspicious/greedy), it shouldn't be more than RM3.9K. And just because Nvidia recommends USD 999 doesn't mean sellers aren't getting their % cut selling at the base recommended price...
http://www.lowyat.net/2015/03/nvidia-gefor...ore-than-rm-4k/

And for your info, AMD always presents a more competitive price than its competitor. E.g. the GTX 980 and R9 290 differ by only 6-9 fps, or in some games not at all, but in price the difference is USD 100-250...
*
That is the reason I refuse to buy anything brand new after GST. I noticed this stupid trend where even before GST they jacked up prices like you mentioned. Not all shops do this, but it's annoying when my favourite shop did it.

The reason the AMD R9 290 series is cheap is that it's old. If you open up GPU-Z you can see the release date: the card is approaching two years old (the 290X came out in Oct 2013!). AMD cut its prices after the GTX 900 series came out around Sept 2014.


Najmods
post Apr 20 2015, 12:24 PM


QUOTE(Unseen83 @ Apr 20 2015, 12:06 PM)
I got mine in April 2014 for RM1.5K, a Sapphire R9 290X Vapor-X.
*
Good for you. Brand new or used? At least at that time, when the mining craze was on the rise, AMD card prices weren't affected over here.
Najmods
post Apr 20 2015, 12:28 PM


QUOTE(sai86 @ Apr 20 2015, 12:22 PM)
But the performance ain't old. Love my Tri-X 290 now.
The only thing left to do is find some rubber and fix the goddamn shroud noise.
*

Yup, because AMD built the card with an enormous 512-bit memory bus and 64 ROPs, the performance is awesome. If you read the reviews, 290 cards handle high resolutions with AA better than Nvidia cards. Shroud noise? What's the problem with your card?
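
For context, the back-of-envelope bandwidth math (going from memory: the reference 290/290X runs its GDDR5 at an effective 5 Gbps, the GTX 980's runs at 7 Gbps on a 256-bit bus) shows why the 290 holds up so well at high resolution with AA:

```latex
BW_{290} = \frac{512\ \text{bit} \times 5\ \text{Gbps}}{8} = 320\ \text{GB/s}
\qquad
BW_{980} = \frac{256\ \text{bit} \times 7\ \text{Gbps}}{8} = 224\ \text{GB/s}
```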

Najmods
post Apr 27 2015, 07:24 AM


QUOTE(Unseen83 @ Apr 26 2015, 11:53 AM)
Agreed. GCN was designed to work with DX12, but M$ took its sweet time, then fast-tracked it once it saw Mantle (now Vulkan) as a threat to its Windows OS (people might skip M$ Win10 and go for the free option, SteamOS).
*

Vulkan is not Mantle; it's the next-generation OpenGL with goals similar to Mantle's and DirectX 12's.
Najmods
post Apr 27 2015, 01:08 PM


QUOTE(Unseen83 @ Apr 27 2015, 11:42 AM)
Mantle was used to accelerate the development of Vulkan, but Vulkan isn't Mantle per se.
Najmods
post May 1 2015, 09:38 AM


QUOTE(evilme @ Apr 30 2015, 03:47 PM)
The benchmark was OK, but during the test the 2nd GPU wasn't recognised/included. How come?
*
Have you disabled ULPS?
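
In case it helps, the usual community culprit for a "missing" second GPU is ULPS parking the slave card, and the common fix is setting the EnableUlps registry value to 0 under the display-adapter class key. Here's a read-only sketch (assumes Windows with AMD Catalyst drivers; back up your registry and make any actual change at your own risk) that just reports what's currently set:

```python
# Read-only scan for AMD's EnableUlps value (0 = ULPS disabled).
import winreg

GPU_CLASS_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
                 r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

def list_ulps_settings():
    """Print EnableUlps for every display-adapter subkey that has one."""
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, GPU_CLASS_KEY) as root:
        index = 0
        while True:
            try:
                sub = winreg.EnumKey(root, index)   # "0000", "0001", ...
            except OSError:
                break                               # ran out of subkeys
            index += 1
            try:
                with winreg.OpenKey(root, sub) as adapter:
                    value, _ = winreg.QueryValueEx(adapter, "EnableUlps")
                    print(f"{sub}: EnableUlps = {value}")
            except OSError:
                continue                            # this subkey has no EnableUlps

if __name__ == "__main__":
    list_ulps_settings()
```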
