AMD Radeon™ Discussion V9, Latest - 13.11 Beta 9.5 | WHQL - 13.10
Desprado | Oct 15 2013, 09:09 AM | Getting Started
QUOTE(Boldnut @ Oct 15 2013, 06:59 AM) Because that particular ASRock board does not support 125W FX CPUs? http://www.asrock.com/mb/NVIDIA/N68C-GS%20FX/?cat=CPU You want him to blow up his mobo ar? lol. But seriously, thanks to djronzai, it is the CPU draw-call overhead they have been talking about. Developers have been complaining for years that DirectX is an inefficient API. If Mantle really is close to the metal, it should make developers' lives easier. Microsoft has been rather dormant in DirectX development lately; they seem to focus on the Xbox. That is why AMD paid 8 million to DICE to use Mantle, at developer request.

If it was a developer request, then why is AMD paying? Because it is just rubbish and marketing, nothing more. Even John Carmack, the creator of Linux and the Doom games, says Mantle and Nvidia's OpenGL are no different, but neither can be used yet.
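To put a rough number on the draw-call argument: a minimal C++ sketch, where the per-call CPU costs are assumptions for illustration rather than measured figures, shows how quickly per-call overhead eats a frame budget.

```cpp
#include <cstdio>

// Assumed per-draw-call CPU costs in microseconds. These numbers are
// illustrative only; the point is the relative gap between a heavyweight
// API path and a thin "close to the metal" one, not the absolute values.
constexpr double kHighLevelCostUs = 1.0; // hypothetical DirectX-11-era cost
constexpr double kThinApiCostUs   = 0.1; // hypothetical low-level-API cost

// CPU time spent just issuing the draw calls for one frame.
double frameCpuTimeMs(int drawCalls, double costPerCallUs) {
    return drawCalls * costPerCallUs / 1000.0;
}

int main() {
    const int counts[] = {1000, 5000, 10000};
    for (int calls : counts) {
        std::printf("%5d draws: high-level %4.1f ms, thin API %3.1f ms\n",
                    calls,
                    frameCpuTimeMs(calls, kHighLevelCostUs),
                    frameCpuTimeMs(calls, kThinApiCostUs));
    }
    // At 10000 draws the assumed high-level path burns 10 of the 16.7 ms
    // a 60 fps frame allows, before any game logic has run.
}
```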
Desprado | Oct 15 2013, 12:56 PM | Getting Started
QUOTE(Boldnut @ Oct 15 2013, 11:30 AM) AMD paid DICE because of the exclusive bundle and optimization. The deal also includes the exclusive rights to use Battlefield to promote the Mantle API. This is the same as Nvidia paying Ubisoft 5 million to get their games optimized for Nvidia cards. Mantle is not rubbish; you obviously have a poor understanding of software and APIs. How could you call John Carmack the Linux creator? He isn't the creator. Doom used OpenGL, not Mantle; Mantle did not exist back then. OpenGL is a high-level API, designed for the best compatibility across different GPUs, and it was initially designed for professional graphics. Performance-wise, Mantle is going to be more efficient than DirectX/OpenGL in every area. Your entire post is totally wrong.

Logically, what I am about to say will be hated by AMD fanboys, but it is a fact. Let's start: no dev will make Mantle a first priority, because they need to target all gamers. AMD says they made Mantle at devs' request, so where are those devs, and why is AMD paying so much money to get the API used? If devs wanted a low-level API, do you think they would ask AMD, which does not even have the bigger share of the GPU and processor market, to develop it? If devs avoided OpenGL or DX and started using AMD's API, they would have to bear a huge loss, which I think would be idiotic, because AMD does not have the majority of GPU users. It is a fact that Nvidia has the bigger market share, with about 65% of GPU users.
Desprado | Oct 15 2013, 01:06 PM | Getting Started
QUOTE(sasaug @ Oct 15 2013, 11:08 AM) John Carmack is not the creator of Linux; Linus Torvalds is. John Carmack is the founder of id Software, whose famous Quake engine sort of revolutionised 3D graphics gaming. I can't parse your last sentence, but you have to understand how software and hardware work. High-level APIs like DirectX and OpenGL provide a common interface; the aim of OpenGL is cross-platform compatibility, while DirectX is a Microsoft thing. AMD has promised to open up the hardware and create an OpenGL extension with near-Mantle performance. Technically speaking, OpenGL and DirectX could be implemented on top of Mantle, so those APIs could also gain that access and run faster. Mantle is sort of the card's open low-level access, which can be used by anybody. You can wrap it in an OpenGL interface, so developers don't even need to worry about learning it, since OpenGL already does the work for them: you just call your OpenGL functions normally, and OpenGL goes through the Mantle API instead of the usual route.

Like I said above, no dev will make Mantle a first priority. It cannot be used until AMD pays them a huge amount, or else devs will go bankrupt by supporting only Mantle.
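The wrapping idea sasaug describes can be sketched in a few lines of C++. The back-end function names below are hypothetical stand-ins, not real Mantle or OpenGL entry points:

```cpp
#include <cstdio>

// Hypothetical back-ends; in a real build these would be the driver's
// usual path and the low-level (Mantle-style) path respectively.
void legacyDraw(int meshId)   { std::printf("mesh %d via the usual driver path\n", meshId); }
void lowLevelDraw(int meshId) { std::printf("mesh %d via the low-level path\n", meshId); }

// Decided once at startup, e.g. after probing whether a GCN GPU is present.
const bool kHasLowLevelBackend = true;

// The OpenGL-style call the developer keeps writing, unchanged.
void glStyleDraw(int meshId) {
    if (kHasLowLevelBackend)
        lowLevelDraw(meshId); // rerouted under the hood
    else
        legacyDraw(meshId);   // the normal route
}

int main() {
    glStyleDraw(42); // application code never mentions the low-level API
}
```

Same function call on the surface; only the dispatch underneath changes, which is why an OpenGL extension with near-Mantle performance is plausible.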
Desprado | Oct 15 2013, 02:47 PM | Getting Started
QUOTE(Boldnut @ Oct 15 2013, 01:36 PM) OK boss, next time I need to be extra specific. Let's not get into this fanboy thing.
1. I already said AMD paid for the exclusive rights to BF4 marketing; Nvidia paid Ubisoft 5 million to do the same. AMD is not forcing DICE to use that API exclusively. In fact the API is co-developed by DICE as well; if you want to develop something you have to put some money into it. There is no such thing as free development, or DICE putting up 100% of the money to develop the API. That is where the 8 million comes in as part of the deal.
2. Nobody is saying they will completely drop DirectX/OpenGL, nor do they claim Mantle is their top priority. That story is coming from your mouth only. Mantle is an add-on feature that only benefits AMD GPUs; it is not an API meant to completely replace DirectX/OpenGL.
3. AMD commands 100% share of the console market. Games have been developed for consoles first and only then ported to PC for years already. The next-gen consoles are x86+GCN based, so there is a lot of similarity between the PS4/Xbox and a PC with an AMD GCN GPU. Developers will very likely use this API to develop games for PS4/Xbox/PC with GCN GPUs, and only then add DirectX/OpenGL support for Nvidia/Intel GPUs. It is much easier that way than dealing with 3-4 different APIs across 3 platforms. In other words, DirectX/OpenGL becomes the extra work for developers making console-ported games that use Mantle.

But the consoles will not use Mantle, so take that word "port" out. http://www.pcper.com/news/Graphics-C...tible-Xbox-One http://blogs.windows.com/windows/b/a...-direct3d.aspx
Desprado | Oct 15 2013, 09:51 PM | Getting Started
QUOTE(law1777 @ Oct 15 2013, 07:03 PM) Any Titan scores?

Titan scores are around 6xxx to 7xxx: http://www.3dmark.com/hall-of-fame-2/3dmar...ion+1.0.5/1+gpu It is not even near the Titan in 3DMark.
Desprado | Oct 16 2013, 01:04 PM | Getting Started
QUOTE(Acid_RuleZz @ Oct 16 2013, 11:51 AM) The Wii U also uses an AMD GPU, but not the GCN architecture, IIRC. Nvidia is probably going to reveal a new GPU next month in Montreal, Canada, and there is going to be a "Super Secret Meeting Livestream" with PCPer on the 21st of this month. NVIDIA is developing something BIG and it's not a GPU. Read more at http://www.tweaktown.com/news/33025/rumort...KRBMKulaestZ.99

There, I am giving you your answer.
Desprado | Oct 24 2013, 04:40 PM | Getting Started
QUOTE(AaronFPS @ Oct 24 2013, 04:03 PM) So actually at normal resolutions up to 1440p or 1600p it cannot beat the GTX 780, but the price is damn good. However, as Guru3D and AnandTech say, it goes up to 95°C and has a lot of noise problems.
Desprado | Oct 25 2013, 10:14 PM | Getting Started
QUOTE(stringfellow @ Oct 25 2013, 09:24 PM) Or you can choose to space the cards out if you have a larger mobo. Having no CF connector any more allows you to do that.

I was planning to, but what holds me back is that I have a 4-way SLI mobo, and with the R9 290X you cannot leave a gap between them. What I mean is that if you CrossFire, you have to use the 1st and 2nd PCIe slots; you cannot use the 3rd slot, it will not be detected whatsoever.
Desprado | Oct 25 2013, 10:45 PM | Getting Started
QUOTE(stringfellow @ Oct 25 2013, 10:31 PM) Is this a restriction of the mobo or of CF in general? Not very familiar with the CF architecture; I've been on the green side for a while already.

The new CF. It will not CrossFire if you use gaps; it does not allow you to gap. Even Guru3D was shocked. Wait, see here what I meant: http://imageshack.us/scaled/landing/15/g3xu.png
Desprado | Nov 7 2013, 11:16 AM | Getting Started
Well, I can cook my food or breakfast on the R9 290X, and there is no need for matches. A kitchen revolution; thanks to AMD, save time and save money.
Desprado | Nov 7 2013, 11:32 AM | Getting Started
QUOTE(Unseen83 @ Nov 7 2013, 11:21 AM) Coming from a GTX 780... so how does it feel to be bent over by Nvidia? Are you pregnant?

I am a student living alone and I cook for myself, so I don't need to buy matches any more.
Desprado | Nov 9 2013, 04:33 PM | Getting Started
[THW] Review samples vs. retail R9 290X boards (large differences)
We first observed differences between the Radeon R9 290X cards that AMD sent out for review and the ones being sold online just before our R9 290 coverage went live. After additional testing, we have answers, feedback from AMD, and more questions.
Leading up to the Radeon R9 290X launch, I had two cards in the lab, both supplied by AMD. They ran at slightly different core frequencies in single-card configurations. No big deal, right? That was just PowerTune doing its job. But paired up in CrossFire, even with plenty of space between them, heat build-up forced the pair to drop to even lower clock rates. Right away, I was able to establish that AMD’s Hawaii GPU operates within a range of clock rates, determined by a number of variables. For R9 290X, that range starts at 727 and ends at 1000 MHz. As an aside, I do have an issue with vendors simply advertising this as a 1000 MHz GPU.
The discussion that so badly requires clarification, however, is a derivative of Hawaii’s inherent behavior. Because the GPU is always trying to run as fast as possible, and then adjusting down to obey certain power, temperature, and fan speed settings, every card is going to be just a little bit different. This is expected, and applies also to Nvidia’s GPU Boost-capable cards. But when I got my hands on a retail card purchased from Newegg, it was way slower. Like, 20% slower in many cases.
So that left me wondering: were the press boards just lower-leakage parts able to sustain higher clocks? Without a definitive answer, and with AMD insistent that my results couldn’t be right, I approached the Radeon R9 290 review much more cautiously, presenting data from both the sampled and store-bought 290X cards.
A Break In The Story
Then, on Tuesday, I received word from AMD that it had a smoking gun.
All of my numbers were generated using Quiet mode—AMD’s name for its 40% PWM ceiling, designed to keep acoustics in check. That’s a signal representing a percentage of maximum voltage into the cooling fan. So, from one card to another, 40% should yield roughly the same fan speed. Variance in fan speed is limited to the mechanical components, and should be very small.
But as it turns out, 40% on the cards AMD sent and 40% on the retail cards do not equal the same fan speed. To confirm, I compared the press card to retail boards from Sapphire and Asus, using our Battlefield 4 benchmark at 2560x1440 and Ultra detail settings.
Hail to AMD.
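The fan-speed mismatch THW describes is easy to model: a minimal sketch, assuming a linear duty-cycle-to-RPM response and made-up calibration numbers, shows how the same 40% PWM setting can mean different real fan speeds and therefore different sustained clocks.

```cpp
#include <cstdio>

// Assumed linear fan response: rpm = maxRpm * duty. The max-RPM figures
// are invented for illustration; THW's finding was that press and retail
// boards responded differently to the same 40% duty cycle.
struct Fan {
    const char* board;
    double maxRpm; // RPM at 100% PWM duty (assumed)
};

double rpmAtDuty(const Fan& fan, double duty) { return fan.maxRpm * duty; }

int main() {
    const Fan press  = {"press sample", 5000.0}; // hypothetical
    const Fan retail = {"retail card",  4300.0}; // hypothetical

    const double quietModeDuty = 0.40; // AMD's Quiet-mode PWM ceiling
    std::printf("%s: %4.0f RPM at 40%% duty\n", press.board,  rpmAtDuty(press,  quietModeDuty));
    std::printf("%s: %4.0f RPM at 40%% duty\n", retail.board, rpmAtDuty(retail, quietModeDuty));
    // Less airflow means more heat, and PowerTune answers heat by dropping
    // the core clock inside its 727-1000 MHz window, which is how identical
    // settings ended up roughly 20% slower on the retail card.
}
```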
Desprado | Dec 11 2013, 12:17 PM | Getting Started
QUOTE(Acid_RuleZz @ Dec 11 2013, 11:40 AM) Rumor has it the price for the older 7000 series went up because of Litecoin mining. Is it true?

Yes, and the R9 290, R9 290X and R9 280 all went up by $80 to $100. Really hard to believe; this card is actually really good for gaming, but most people are not buying it for gaming, they are buying it to do business. http://forums.anandtech.com/showthread.php?t=2358139
Desprado | Dec 11 2013, 01:49 PM | Getting Started
QUOTE(Acid_RuleZz @ Dec 11 2013, 01:47 PM) This is good and bad news for me. I read that the R9 290 will pay for itself in under 2 months if you mine 24/7. Newegg is mostly OOS for the 290/290X.

Don't worry, AMD buyers and users have already complained, and AMD's Roy is looking into this issue; he will stop it and limit purchases for business use.
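The "pays for itself in under 2 months" claim is simple arithmetic. A sketch with assumed numbers (the card price, coin yield, exchange rate, and power cost are all illustrative, not sourced):

```cpp
#include <cstdio>

int main() {
    // All figures below are assumptions for illustration only.
    const double cardPriceUsd    = 400.0; // assumed R9 290 street price
    const double ltcPerDay       = 0.35;  // assumed daily Litecoin yield
    const double ltcPriceUsd     = 30.0;  // assumed LTC exchange rate
    const double powerCostPerDay = 0.90;  // roughly 300 W at $0.12/kWh, 24/7

    const double profitPerDay = ltcPerDay * ltcPriceUsd - powerCostPerDay;
    std::printf("Daily profit: $%.2f, payback in %.0f days\n",
                profitPerDay, cardPriceUsd / profitPerDay);
    // About $9.60/day gives roughly 42 days to recoup the card, i.e. "under
    // 2 months", if the assumptions hold and mining difficulty stays flat.
}
```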
Desprado | Dec 11 2013, 02:00 PM | Getting Started
I know it is off topic, so please, just one question: what the hell is this Bitcoin?
Desprado | Dec 11 2013, 02:33 PM | Getting Started
QUOTE(Vio @ Dec 11 2013, 02:27 PM) What about the price in Malaysia? Has it gone up as well? Damn these miners la... crushing my hope of getting my hands on a non-reference 290...

If the market here follows, that means the 290 will be near 2k? The same price as a reference GTX 780.
Desprado | Dec 11 2013, 02:38 PM | Getting Started
QUOTE(Acid_RuleZz @ Dec 11 2013, 02:36 PM) Dayyummm... anyone want my 7950 for $800? LOL

Now Nvidia is cheaper, wow. I really did not expect that.
Desprado | Dec 13 2013, 02:17 PM | Getting Started
QUOTE(norazwan79 @ Dec 13 2013, 02:14 PM) Mine: Sapphire R9 290 + Arctic hybrid (2 fans), full-load temp 62°C, VRM1 71°C, VRM2 63°C. Not OC'd yet. No more jet sound.

Do share benchmarks.
Desprado | Dec 14 2013, 12:51 PM | Getting Started
QUOTE(Acid_RuleZz @ Dec 14 2013, 12:28 PM) Join the bandwagon and make a profit from it.

I am thinking of getting four R9 290s. Can anyone tell me whether they will damage my mobo or something because of the heating issue? Please guide me on what the CrossFire performance of the R9 290 is really like; I don't trust reviews, I trust real-world experience.
Desprado | Dec 14 2013, 01:14 PM | Getting Started
QUOTE(Acid_RuleZz @ Dec 14 2013, 01:10 PM) I don't think it will damage your mobo, because most 4x CFX/SLI setups with reference coolers also produce high heat when they get packed together. There's no user with 4x 290 here.

I am talking about R9 290 CF performance. What really holds me back is this: http://www.overclock.net/t/1440600/r9-290-...-with-crossfire CrossFire doesn't work well in older games and some new games, as the thread says.