
> nvidia GPU doesn't handle async compute well, gimpwork again?

Quantum Geist
post Feb 25 2016, 10:36 AM

Getting Started
**
Junior Member
109 posts

Joined: May 2013


AMD has dedicated async compute hardware (the ACE units), while Nvidia's current generation doesn't (it's simulated in software instead). If I'm not mistaken, a few PS4 and XBone developers are starting to implement async.

Wait for Polaris and/or Pascal. But Pascal may be a bit slower to release, since there are rumours they are having production trouble.
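For anyone wondering what async compute actually buys you: the idea is that graphics and compute work sit on separate queues and overlap in time, instead of one waiting for the other. A rough CPU-side analogy in Python, just to illustrate the scheduling idea (the sleep times are made-up stand-ins, not actual GPU code):

```python
import threading
import time

def graphics_work():
    time.sleep(0.2)  # stand-in for a frame's worth of rendering

def compute_work():
    time.sleep(0.2)  # stand-in for a physics/post-process compute job

# Without async compute: the compute job waits for graphics to finish.
start = time.perf_counter()
graphics_work()
compute_work()
serial = time.perf_counter() - start

# With async queues: both jobs are in flight at once and overlap.
start = time.perf_counter()
threads = [threading.Thread(target=graphics_work),
           threading.Thread(target=compute_work)]
for t in threads:
    t.start()
for t in threads:
    t.join()
overlapped = time.perf_counter() - start

print(f"serial: {serial:.2f}s  overlapped: {overlapped:.2f}s")
```

The overlapped run finishes in roughly half the time, which is the whole appeal: idle shader capacity during graphics work gets filled with compute work instead of sitting there.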
RevanChrist
post Feb 25 2016, 10:38 AM

New Member
*
Junior Member
32 posts

Joined: Sep 2009
Doesn't matter. Going to jump ship to Polaris/Pascal. Maxwell cuts too many corners to optimize power consumption. Hawaii is too power hungry.
SUScrash123
post Feb 25 2016, 10:38 AM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
QUOTE(zerorating @ Feb 25 2016, 10:34 AM)
hint: Nvidia has a contract with S.E, it's an Nvidia game.
they will release the patch when the contract ends.
*
Got any proof?
TSzerorating
post Feb 25 2016, 10:41 AM

Miskin Adab
*****
Senior Member
970 posts

Joined: Aug 2007
From: Lokap Polis


QUOTE(crash123 @ Feb 25 2016, 10:38 AM)
Got any proof?
*
Have you even heard of a non-disclosure agreement? It would hurt both companies if they told the public about it.
Do you want S.E to say: "we intentionally sabotaged AMD, because we wanted to"?

And why are you so defensive?
We shouldn't support a company that tries hard to hold back new technology that actually helps.

This post has been edited by zerorating: Feb 25 2016, 10:45 AM
unknown_2
post Feb 25 2016, 10:49 AM

On my way
****
Junior Member
571 posts

Joined: Mar 2012


QUOTE(zerorating @ Feb 25 2016, 10:20 AM)
what about current products? can we throw them in the dustbin? disappointing for me as a gtx760 owner, it's now already one of the lowest tiers in the latest games.
*
nvidia is like a hot girl, but the catch is she ages terribly.
AMD is that OK-looking girl with inner beauty.

choose your girl.
TSzerorating
post Feb 25 2016, 10:49 AM

Miskin Adab
*****
Senior Member
970 posts

Joined: Aug 2007
From: Lokap Polis


QUOTE(nyess @ Feb 25 2016, 10:47 AM)
the article is about the new update for the game, now with async compute

your thread is about how nvidia is gimped and doesn't handle async compute well

slowpoke la, if the issue was first addressed almost half a year ago, why are you only now opening a thread about it like a slowpoke

everybody knows already

you should have just read the article and kept the thoughts to yourself

instead of opening a thread going "eh eh nvidia gimpwork again?"
*
we know nvidia published that their products do support async compute (even if via software), but we never heard that it would lose performance.
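That's the crux of the software-vs-hardware distinction: accepting work on a compute queue is not the same as executing it concurrently. A toy sketch of the difference, with one worker draining a queue (software-style scheduling) versus two workers genuinely overlapping it; the dispatch counts and sleep times are illustrative assumptions only:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def dispatch(_):
    time.sleep(0.1)  # stand-in for one compute dispatch

def drain_queue(engines):
    # Run four queued dispatches on the given number of engines.
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=engines) as pool:
        list(pool.map(dispatch, range(4)))
    return time.perf_counter() - start

software_style = drain_queue(1)  # queue accepted, but one engine drains it serially
hardware_style = drain_queue(2)  # two engines genuinely overlap the work
print(f"one engine: {software_style:.2f}s  two engines: {hardware_style:.2f}s")
```

Both versions "support" the queue in the API sense; only the second one actually gets faster, which is roughly the gap benchmarks were showing.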
SUScrash123
post Feb 25 2016, 10:51 AM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
QUOTE(zerorating @ Feb 25 2016, 10:41 AM)
Have you even heard of a non-disclosure agreement? It would hurt both companies if they told the public about it.

And why are you so defensive?
*
There we go with the conspiracy theory. Why am I so defensive? Because people are spewing bullshit without proof. Just because Nvidia sucks at one DX12 game, we assume they'll suck at every DX12 game? Wait until other games come out, then we can draw a conclusion. I'm not an Nvidia fan. You'll see in my posts that I recommend the R9 390/390X over the GTX 970/980, the GTX 980 Ti over the Fury X, and Fury X CF over GTX 980 Ti SLI.

Inb4 nvidia fanboy
Inb4 nvidia butthurt
TSzerorating
post Feb 25 2016, 11:04 AM

Miskin Adab
*****
Senior Member
970 posts

Joined: Aug 2007
From: Lokap Polis


QUOTE(crash123 @ Feb 25 2016, 10:51 AM)
There we go with the conspiracy theory. Why am I so defensive? Because people are spewing bullshit without proof. Just because Nvidia sucks at one DX12 game, we assume they'll suck at every DX12 game? Wait until other games come out, then we can draw a conclusion. I'm not an Nvidia fan. You'll see in my posts that I recommend the R9 390/390X over the GTX 970/980, the GTX 980 Ti over the Fury X, and Fury X CF over GTX 980 Ti SLI.

Inb4 nvidia fanboy
Inb4 nvidia butthurt
*
it's common practice for a company to sabotage another company. But I don't agree with ripping off their old customers on old products (making a product legacy in just 1 year+).
not everyone has RM3000+ to spend on a new graphics card just for it to stay relevant for 3 years.

okay, we'll wait for other benchmarks with an async compute toggle, but I doubt we'll see one unless GPUs from both sides can support it.

This post has been edited by zerorating: Feb 25 2016, 11:12 AM
Demon_Eyes_Kyo
post Feb 25 2016, 11:19 AM

On my way
****
Senior Member
682 posts

Joined: Nov 2004



QUOTE(zerorating @ Feb 25 2016, 11:04 AM)
it's common practice for a company to sabotage another company. But I don't agree with ripping off their old customers on old products (making a product legacy in just 1 year+).
not everyone has RM3000+ to spend on a new graphics card just for it to stay relevant for 3 years.

okay, we'll wait for other benchmarks with an async compute toggle, but I doubt we'll see one unless GPUs from both sides can support it.
*
GTX 760? Same card as mine. But that's a 2013 card. Where does 1 year+ come from?
TSzerorating
post Feb 25 2016, 11:23 AM

Miskin Adab
*****
Senior Member
970 posts

Joined: Aug 2007
From: Lokap Polis


QUOTE(Demon_Eyes_Kyo @ Feb 25 2016, 11:19 AM)
GTX 760? Same card as mine. But that's a 2013 card. Where does 1 year+ come from?
*
the gtx760 may have a longer shelf life, but the gtx780ti was released in November 2013, and the GTX 700 series became legacy products in 2015.
Demon_Eyes_Kyo
post Feb 25 2016, 01:00 PM

On my way
****
Senior Member
682 posts

Joined: Nov 2004



Classifying it as legacy doesn't mean it's written off, bro. It just means the drivers have reached a state where they're pretty stable and don't require a lot of changes. I believe the 780Ti is still very much capable of playing many games at pretty good settings nowadays.

Btw, not defending Nvidia at any point. Just saying that once drivers are stable, the card gets classified as legacy. The same went for my ATI 4890 back then; after about 1.5 to 2 years it went legacy as well.
deodorant
post Feb 25 2016, 01:06 PM

Surfing LYN instead of Working.
*******
Senior Member
5,691 posts

Joined: Mar 2006


If I'm thinking of buying a PC (looking at upper/mid range, like the GTX 970), is it OK to buy now, or should I wait a bit for the upcoming tech?
TSzerorating
post Feb 25 2016, 01:16 PM

Miskin Adab
*****
Senior Member
970 posts

Joined: Aug 2007
From: Lokap Polis


QUOTE(Demon_Eyes_Kyo @ Feb 25 2016, 01:00 PM)
Classifying it as legacy doesn't mean it's written off, bro. It just means the drivers have reached a state where they're pretty stable and don't require a lot of changes. I believe the 780Ti is still very much capable of playing many games at pretty good settings nowadays.

Btw, not defending Nvidia at any point. Just saying that once drivers are stable, the card gets classified as legacy. The same went for my ATI 4890 back then; after about 1.5 to 2 years it went legacy as well.
*
In my history with graphics cards, once a product reaches legacy status, game-specific optimizations stop; only bugfixes, or optimizations that can be applied across all products, still ship, making the card even worse relative to newer products with lower specifications.

Yes, I know game graphics budgets can differ: some games depend heavily on polygon counts, others are heavy on shaders or post-processing, and some depend on memory throughput and memory size.

tl;dr: a GPU that can run Quake 3 at 5000 fps won't necessarily perform better in the latest games than a GPU that can only run Quake 3 at 1000 fps.

how come the gtx760 is weaker than the hd7850 in the latest game, even though it had a Game Ready driver for it?
[benchmark image]

This post has been edited by zerorating: Feb 25 2016, 01:22 PM
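The "different games stress different things" point above can be put in rough numbers with a back-of-envelope roofline estimate: a workload's frame rate is capped by whichever of compute throughput or memory bandwidth runs out first. The card specs and per-frame costs below are made-up illustrative values, not real benchmarks:

```python
# Roofline-style estimate: a frame rate is capped by whichever of
# compute throughput or memory bandwidth runs out first.
def fps_estimate(gpu, workload):
    compute_cap = gpu["gflops"] / workload["gflop_per_frame"]
    bandwidth_cap = gpu["gbps"] / workload["gb_per_frame"]
    return min(compute_cap, bandwidth_cap)

# Made-up specs: card_a has more raw FLOPS, card_b has more bandwidth.
card_a = {"gflops": 2300, "gbps": 150}
card_b = {"gflops": 1700, "gbps": 200}

shader_heavy = {"gflop_per_frame": 30, "gb_per_frame": 1.0}  # shader-bound game
memory_heavy = {"gflop_per_frame": 10, "gb_per_frame": 4.0}  # bandwidth-bound game

for name, wl in [("shader-heavy", shader_heavy), ("memory-heavy", memory_heavy)]:
    print(name,
          f"card_a: {fps_estimate(card_a, wl):.0f} fps,",
          f"card_b: {fps_estimate(card_b, wl):.0f} fps")
```

With these numbers, card_a wins the shader-heavy game while card_b wins the memory-heavy one, which is exactly how a GTX 760 can fall behind an HD 7850 in one title and not another, driver support aside.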
dante1989
post Feb 25 2016, 01:20 PM

Getting Started
**
Junior Member
288 posts

Joined: Sep 2010


ayyyy gtx 760 owner here too
jamilselamat
post Feb 25 2016, 01:20 PM

Getting Started
**
Junior Member
145 posts

Joined: Jul 2011


QUOTE(nyess @ Feb 25 2016, 09:51 AM)
you only just found out?

people have already known for more than 6 months
*
This.

It was the source of dispute between Ashes developer and NVIDIA marketing team.

EDIT: I made a thread about it 6 months ago.

https://forum.lowyat.net/index.php?showtopic=3696104&hl=

This post has been edited by jamilselamat: Feb 25 2016, 01:25 PM
SUStlts
post Feb 25 2016, 01:20 PM

pee poo pee poo
******
Senior Member
1,891 posts

Joined: Apr 2008
From: Cheras
QUOTE(unknown_2 @ Feb 25 2016, 10:49 AM)
nvidia is like a hot girl, but the catch is she ages terribly.
AMD is that OK-looking girl with inner beauty.

choose your girl.
*
I'd date the hot girl, then marry the one with inner beauty... while still seeing the hot girl every time
