

 NVIDIA GeForce Community V14

amxpayne67
post Feb 9 2015, 05:32 PM

The Coon
*****
Senior Member
718 posts

Joined: Mar 2010
From: Puchong Extreme



Me: from GeForce 4 (2003) -> ATi 9550 (2005) -> GF G310M (2009, laptop) -> AMD HD7850 (2011) -> GF G820M (2014) -> GF GTX 970. Fanboyism? Nope. I go with whichever offers the best for what I need.
Moogle Stiltzkin
post Feb 9 2015, 05:36 PM

Look at all my stars!!
*******
Senior Member
4,453 posts

Joined: Jan 2003
QUOTE(terradrive @ Feb 9 2015, 05:08 PM)
4k tv is getting cheaper and cheaper

But 1440p and 4k monitor is expensive zzzz
*
True, but from what I've read, 4K only makes a noticeable difference depending on the size of your screen.

Also, most media I have is DVD, 720p or 1080p, so I don't think it will scale well at 4K resolution; the video will look less sharp, which is counterintuitive. 4K is definitely the future, but it doesn't help your older media, especially the lower-resolution stuff, because it has to be scaled up more to fit the higher resolution. Sure, you can drop the display from native resolution, but that just reduces quality... it's always better to run a native resolution close to the media's resolution for a sharper image when watching movies.

To me, 4K only becomes feasible when 4K media is the norm (anime... TV series... movies...) and, for gaming, graphics cards are powerful enough for this new ultra-HD resolution. Right now GPUs aren't keeping up with 4K, especially at 60fps; you can forget about that (for now anyway) cry.gif

Another consideration: 4K Netflix requires 25Mbps UniFi or higher. I'm on 10Mbps, so that's out of the question until broadband at that kind of speed becomes affordable (which is not going to happen any time soon, unfortunately).

Also, if you get 4K now, by the time 4K becomes the norm there will be more advanced 4K monitors on the market, so the future-proofing you paid for was wasted. I'd rather wait for the market to mature before opting into 4K, once it's stable.
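To see how much stretching is involved, here is a quick Python sketch of the linear upscale factor for common source resolutions on a 2160p panel (just the scaling arithmetic, not a video-quality model):

```python
# Linear upscale factor needed to fit common source material on a 4K UHD (2160p) panel.
# Non-integer factors can't map source pixels 1:1, which tends to look softer.
sources = {"DVD (480p)": 480, "720p": 720, "1080p": 1080}
TARGET = 2160  # 4K UHD vertical resolution

for name, lines in sources.items():
    factor = TARGET / lines
    kind = "integer scale" if factor.is_integer() else "non-integer scale (softer)"
    print(f"{name}: {factor:.2f}x linear upscale, {kind}")
```

So 1080p and 720p at least scale by whole factors (2x and 3x), while DVD material needs a fractional 4.5x stretch, which is where upscaling artifacts bite hardest.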




QUOTE(cstkl1 @ Feb 9 2015, 05:15 PM)
Moogle Stiltzkin

Nvidia Can bring down the cost of Gsync Module by finalizing it to ASIC rather than FPGA
Sounds great (though honestly I don't know what that is, except for the price decrease, which I'm all for). Did they give a timeline for when this will happen? Or is that just what we hope will happen? hmm.gif

This post has been edited by Moogle Stiltzkin: Feb 9 2015, 05:42 PM
cstkl1
post Feb 9 2015, 05:38 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(Moogle Stiltzkin @ Feb 9 2015, 05:36 PM)
true but from what i read 4k is only going to make a difference depending on the size of your screen.

also most media i have is either dvd, 720p or 1080p, so i don't think it will scale well on 4k resolution, the video quality will be less sharp which is counter intuitive.

4k to me is only going to be feasible when 4k media becomes the norm, and also graphics card are powerful enough for this new HD ultra resolution. Right now gpu isn't keeping up with 4k yet especially in regards to 60fps then can forget it (for now anyway) cry.gif
sounds great. did they give a timeline when this will happen ? Or is that what we hope will happen ?  hmm.gif
*
Dude, G-Sync is working right now, optimal and fully functional...

If you were Nvidia... you'd just wait and see how FreeSync works out with a mediocre AMD driver team.

Just waiting for AMD, as usual, to blame the scaler vendors if there's an issue.


Moogle Stiltzkin
post Feb 9 2015, 05:44 PM

Look at all my stars!!
*******
Senior Member
4,453 posts

Joined: Jan 2003
QUOTE(cstkl1 @ Feb 9 2015, 05:38 PM)
dude gsync is working right now and optimal fully functioning...

If u were Nvidia ... just have to see how freesync works with a mediocre AMD driver team.

Just waiting for AMD as usual if there is a issue to blame the scaler vendors.
*
That is true... but the technical explanation still isn't convincing. Maybe if, as you say, it was right in front of me, I could judge:

1. Can I live with FreeSync over G-Sync (or will I still be complaining about it)?
2. How much?

hmm.gif



My GPU history was:

nvidia -> ATI Radeon... something... -> nvidia -> nvidia 8800 GTS -> nvidia GTX 680



I'm also not a fanboy; I go with whoever I see is the winner. I think fanboyism is stupid, because as consumers we should always get the best performance/price-ratio product for our needs, not buy just for the sake of a brand rolleyes.gif

Though after that GTX 970 VRAM scandal, with ATI later pointing out that its 4GB of VRAM means just that, plus a price reduction, I can't blame people for getting refunds and switching sides rolleyes.gif (Nvidia brought this on themselves.)

This post has been edited by Moogle Stiltzkin: Feb 9 2015, 05:51 PM
cstkl1
post Feb 9 2015, 05:54 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(amxpayne67 @ Feb 9 2015, 05:32 PM)
Me from GeForce 4 (2003) -> ATi 9550(2005) -> GF G310M (2009)(Laptop)->AMD HD7850 (2011)-> GF G820M(2014)-> GF GTX 970. Fanboyism? nope. I go whichever offers the best for what i need.
*
Since we are talking about dedicated cards in my main rig (not including the ones I bought just to test, which are more than this):

» Spoiler «
...and I will never go back to ATI.

The reasons are:

1. Packaging (yup). Every time. Paid a lot, opened it, and always had a "did I get conned?" kind of feeling.
2. Drivers.
3. They always blame other people.
4. Hoovers (the coolers).
5. Expecting them to go bankrupt any time now.

But seriously, in my whole lifetime of gaming and buying games, Nvidia rarely disappoints me compared to ATI when it comes to just installing a game and playing, SLI issues aside.

Not saying Nvidia doesn't have its issues, but seriously, if money is no object, you rarely find a dedicated gamer having any issue with Nvidia on new game releases.

They [AMD] are wasting time on Mantle. I guess they had no choice, as their driver team has a lot of issues with the DirectX 11 API, so this shifts the problem into game developers' hands... but ultimately DX12 will be better, as Microsoft has better people working on it.

Hence my skepticism on FreeSync. Nvidia has a lot of fantastic programmers, and they said G-Sync wasn't easy; in the end they had to use an FPGA.

This post has been edited by cstkl1: Feb 9 2015, 06:10 PM
ngkhanmein
post Feb 9 2015, 06:08 PM

カラカラ Karakara
*******
Senior Member
7,727 posts

Joined: Jan 2010
From: Ara Damansara, Petaling Jaya & Batu Pahat, Johor.


QUOTE(cstkl1 @ Feb 9 2015, 05:54 PM)
Since we are talking about dedicated cards on my main rig ( not including the ones i bought just to test which is more than this)

» Click to show Spoiler - click again to hide... «
  and will never go back to ATI

reason is

1. Packaging (yup). Everytime. Paid a lot. Open it and always like hmm.. errr... did i get con kind of feeling.
2. Drivers
3. They always blame other ppl.
4. Hoovers.
5. Expecting them to bankrupt any time now.

but seriously on my whole lifetime of gaming and buying games. Rarely Nvidia Disappoints me compared to Ati on just installing a game and just play. Putting aside sli issues.

Not saying nvidia doesnt have its issue but seriously if money is not an object.. you rarely find a go to gamer having any issue with Nvidia on new game releases.

They are wasting time on Mantle. But i guess they had no choice as their drivers team has a lot of issues with directx11 api so to solve the issue in the future and put it at game developers hands... but ultimately dx12 will be better as microsoft have better ppl working on it.

y my spektism on freesync. Nvidia has a lot of fantastic programmers. They said it wasnt easy for gsync until hey had to use FPGA.
*
Totally agree with your statement notworthy.gif thumbup.gif We need quality, not quantity.
zizi393
post Feb 9 2015, 06:14 PM

On my way
****
Senior Member
562 posts

Joined: Nov 2009


QUOTE(SSJBen @ Feb 9 2015, 03:24 PM)
You are only looking at the surface. It's not uncommon that the most recent PC titles are terrible and unoptimized, that even if you have 980/TitanBlack/780Ti or SLI configs, you still can't run the game at its maximum settings.

Games auto detecting "best setting" has NEVER been accurate. I turn SoM on with a 970 SLI config and it detects 720p with medium settings for me on a 1440p screen. If you call that "best", then wow... doh.gif
SoM isn't the only game that does it, pretty much every single game I've played since the last 6 years has this issue.
*
To be fair, most games don't ship with an SLI configuration preset; SLI support usually comes several weeks after the game's release. The majority of consumers use a single card.
cstkl1
post Feb 9 2015, 06:17 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(ngkhanmein @ Feb 9 2015, 06:08 PM)
totally agreed with ur statement  notworthy.gif  thumbup.gif we need quality not quantity
*
Just look at the Titan/780/780Ti/980 cooler.

It looks good. Even before you test it, it reaffirms at least a "money well spent" kind of feeling.

It's the same as going to a massage parlour, ahem, and paying RM50 for the regular auntie versus RM150 for Miss Universe.
If this anecdote is that simple... why isn't buying a GPU the same?
ATI is the former, while Nvidia is the latter.

Look at phones. Apple led in the beginning with design, asked for a premium, and still does lol... Android came in with crappy-looking phones. Phones now are packaged like iPhones. Android kicked Apple's butt. No more bulky huge boxes.

It's one of the reasons EVGA/Asus have excelled for a long time, and now even their low/mid-end gets good packaging. Gigabyte (I actually got a warning on OCN for the nickname I called them) has totally revamped itself after realizing this. People will pay for it.

Spending money on design and packaging matters.

In that respect I will still recommend the 970 over the 290X any day.

Btw, full Maxwell has a lot of work to do to satisfy the enthusiast level. Everybody is waiting for 4K to be dominated by dual cards at a minimum of 60fps with everything turned up. That is no small feat.

Current offerings, overclocked, manage it at 1440p. You're going to need 2.25x the performance of these cards to achieve that at 4K. That's no joke.
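For what it's worth, that 2.25x figure is exactly the pixel-count ratio between 4K UHD and 1440p (real performance rarely scales perfectly linearly with pixel count, so treat it as a rough upper-bound estimate):

```python
# Pixel-count ratio: 4K UHD vs 1440p (QHD).
uhd_pixels = 3840 * 2160   # 8,294,400 pixels
qhd_pixels = 2560 * 1440   # 3,686,400 pixels
print(uhd_pixels / qhd_pixels)  # 2.25
```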

Signing off... gee, posted a lot today while waiting for the waifu at the airport. But this is the wholehearted truth about how I feel about ATI.
You know the feeling when you tell yourself you never want to be conned/scammed again... that's the reason I don't buy ATI any more, and I doubt they will change.

This post has been edited by cstkl1: Feb 9 2015, 06:28 PM
gogo2
post Feb 9 2015, 06:55 PM

gogo2
********
All Stars
18,672 posts

Joined: Jan 2003
From: Penang


QUOTE(cstkl1 @ Feb 9 2015, 06:17 PM)
Just look at the titan/780/780ti/980 cooler.

It looks good. It reaffirms before you test it out atleast " Money Well Spent" kindda feeling.

Its the same as going to Massage Parlor ahem getting a HJ for RM50 from Pro Auntie and going to another with Miss Universe for RM150.
So if this anecdote is simple.. why isnt buying gpu the same..
ATI is the former while Nvidia is the later

Look at phones. Apple led in the beginning with design and ask for a premium and still do lol... Android came in with crappy looking phones. Phones now are packaged like iphones. Android kicked apple butt. No more bulky huge boxes.

Its one of the reasons why EVGA / Asus excels at for a long time and now even their low/mid end etc all gets good packaging. Gaygay ( Actually got a warning in OCN for calling gigabyte Gaygay..) has totally revamped itself realizing this. Ppl will pay for it.

Spending money for design and packaging matters.

In that respect i will still recommend the 970 over 290x any day.

btw the full maxwell has a lot of work to do to satisfy the enthusiast level. Everybody is waiting for 4k to be dominated with dual cards at min 60fps with everything turned up. This is no small feat.

Current offerings overclock does it for 1440p. ure gonna need 2.25x the performance of these cards to achieve that. Thats no joke.

signing off.. Gee posted a lot today while waiting for waifu at airport. But this the whole hearted truth on how i feel about ati.
You know the feeling you tell ureself you will never want to be conned/scam again.... thats the reason why i dont buy ati anymore and i doubt they will change.
*
WARNING!!! FANBOY detected.

If you buy the R9 290X Lightning Edition, you'll get great packaging. Packaging has nothing to do with ATI; Asus makes ATI cards too. I'm not sure why you blame ATI for packaging. LOL...


terradrive
post Feb 9 2015, 07:03 PM

RRAAAWWRRRRR
******
Senior Member
1,943 posts

Joined: Apr 2005


QUOTE(cstkl1 @ Feb 9 2015, 06:17 PM)
Just look at the titan/780/780ti/980 cooler.

It looks good. It reaffirms before you test it out atleast " Money Well Spent" kindda feeling.

Its the same as going to Massage Parlor ahem getting a HJ for RM50 from Pro Auntie and going to another with Miss Universe for RM150.
So if this anecdote is simple.. why isnt buying gpu the same..
ATI is the former while Nvidia is the later

Look at phones. Apple led in the beginning with design and ask for a premium and still do lol... Android came in with crappy looking phones. Phones now are packaged like iphones. Android kicked apple butt. No more bulky huge boxes.

Its one of the reasons why EVGA / Asus excels at for a long time and now even their low/mid end etc all gets good packaging. Gaygay ( Actually got a warning in OCN for calling gigabyte Gaygay..) has totally revamped itself realizing this. Ppl will pay for it.

Spending money for design and packaging matters.

In that respect i will still recommend the 970 over 290x any day.

btw the full maxwell has a lot of work to do to satisfy the enthusiast level. Everybody is waiting for 4k to be dominated with dual cards at min 60fps with everything turned up. This is no small feat.

Current offerings overclock does it for 1440p. ure gonna need 2.25x the performance of these cards to achieve that. Thats no joke.

signing off.. Gee posted a lot today while waiting for waifu at airport. But this the whole hearted truth on how i feel about ati.
You know the feeling you tell ureself you will never want to be conned/scam again.... thats the reason why i dont buy ati anymore and i doubt they will change.
*
But 290X CF and 980 SLI are still roughly equal in 4K gaming.
SSJBen
post Feb 9 2015, 07:19 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(zizi393 @ Feb 9 2015, 06:14 PM)
to be fair most game dont come with preset of SLI configuration. most SLI support come several weeks after game release. Majority of consumer use single card.
*
Uhm, it doesn't matter whether a game has a proper SLI profile or not. The majority of games I've played (and I play a lot, since I'm a reviewer) have never given me a proper auto preset, be it a single- or multi-card config.

When a game doesn't have an SLI profile, I can easily inject the proper bits for it through Nvidia Inspector, so that's a whole different topic.
McDBigMaC
post Feb 9 2015, 07:40 PM

Casual
***
Junior Member
479 posts

Joined: Jun 2010
Crossing my fingers that DirectX 12 will dramatically improve our performance.

amxpayne67
post Feb 9 2015, 07:48 PM

The Coon
*****
Senior Member
718 posts

Joined: Mar 2010
From: Puchong Extreme



» Spoiler «


That's why I said I go for the best bang for the buck for my needs. Not to offend anyone here, but nothing is perfect. That said, I'm also a victim of the VRAM scandal. But at least I got what I expected from the card. sweat.gif
gogo2
post Feb 9 2015, 07:54 PM

gogo2
********
All Stars
18,672 posts

Joined: Jan 2003
From: Penang


QUOTE(McDBigMaC @ Feb 9 2015, 07:40 PM)
Crossing fingers Direct x 12 will dramatically improve our performance.
*
I think it will. If you SLI it, you might get 7GB instead of 3.5GB.
SSJBen
post Feb 9 2015, 08:03 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(gogo2 @ Feb 9 2015, 07:54 PM)
I think it will. If you SLI it, you might get 7GB instead of 3.5GB.
*
ONLY if games use it. DX12 does not automatically turn all SLI configs into a single card as seen by the system.
cstkl1
post Feb 9 2015, 08:12 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(gogo2 @ Feb 9 2015, 06:55 PM)
WARNING!!! FANBOY detected.

If you buy R9 290X Lightning Edition, you'll get great packaging. Packaging has nothing to do with ATI. Asus also got ATI card. I'm not sure why you blame ATI for packaging. LOL...
*
Ignorant person detected. I've been buying cards for a long time. If we want to talk about cards, then it should be about the reference card, as the manufacturer intends to sell it. Reference cards. If anything goes, then LN2 comes into play. Buy the top-end cards from both camps; until then you won't understand.

Btw, scroll up to see my main rig's card history.

QUOTE(terradrive @ Feb 9 2015, 07:03 PM)
But still 290X CF and 980 SLI is quite equal in 4K gaming
*
Don't trust benchies; they don't represent gameplay. Not siding with either on this. Dying Light, for example, is very dependent on the preset texture stream buffering, which gives 780/780Ti SLI smoother gameplay than 970/980.

This post has been edited by cstkl1: Feb 9 2015, 08:22 PM
cstkl1
post Feb 9 2015, 08:18 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(SSJBen @ Feb 9 2015, 08:03 PM)
ONLY if games uses it. DX12 does not automatically turn all SLI configs into a single card seen by the system.
*
I foresee coding issues. Say GPU 2 wants to access GPU 1's memory controller for that bit of VRAM; then there's the latency over the SLI connector.
Maybe CrossFire will work better, since the 290 uses the PCIe lanes.
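To put rough numbers on that interconnect worry: the bandwidth figures below are approximate, commonly quoted ballparks (assumptions for illustration, not measurements) for a classic SLI bridge, a PCIe 3.0 x16 slot, and a 290X-class card's local GDDR5.

```python
# Rough time to move 1 GiB of frame/texture data over each link.
# Bandwidths are approximate, assumed values for illustration only.
GIB = 1024 ** 3  # bytes

links = {  # link name -> assumed bandwidth in bytes/second
    "classic SLI bridge (~1 GB/s)": 1e9,
    "PCIe 3.0 x16 (~15.75 GB/s)": 15.75e9,
    "local GDDR5, 290X-class (~320 GB/s)": 320e9,
}

for name, bw in links.items():
    print(f"{name}: {GIB / bw * 1e3:.1f} ms per GiB")
```

Under these assumptions, remote VRAM over a classic bridge is orders of magnitude slower than local memory, which is exactly why pooling VRAM across cards is not free even if DX12 exposes it.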

gogo2
post Feb 9 2015, 08:29 PM

gogo2
********
All Stars
18,672 posts

Joined: Jan 2003
From: Penang


QUOTE(SSJBen @ Feb 9 2015, 08:03 PM)
ONLY if games uses it. DX12 does not automatically turn all SLI configs into a single card seen by the system.
*
True also. I assume most AAA titles will support it, since DX12 is available on both nVidia and AMD.

QUOTE(cstkl1 @ Feb 9 2015, 08:12 PM)
Ignorant person detected.Been buying cards for a ling time.  If we wanna talk about cards then be it about reference card as the manufacturer intends to sells it. Reference cards.  If anything goes then ln2 will come in play. Buy the top end cards from both camp n till then you wont understand.
*
Yes, I'm ignorant because I ignore reference cards. Most of the time a reference card is the bare essentials; a non-reference card will give you more FPS, better cooling and more VRM phases. thumbup.gif
cstkl1
post Feb 9 2015, 08:38 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(gogo2 @ Feb 9 2015, 08:29 PM)
True also. I assume most AAA will support since DX12 is available on both nVidia and AMD.
Yes, I'm ignorant because I ignore reference card. Most of the time reference card are bare essential. Non-reference card will have more FPS, better cooling and more VRM.  thumbup.gif
*
Agreed. In that case, a water-chilled, volt-modded EVGA Classified Nvidia card with custom VRAM timings reigns supreme.

Let me guess: you're using a 290X Lightning.




alfiejr
post Feb 9 2015, 09:42 PM

Gaming~
******
Senior Member
1,294 posts

Joined: Feb 2012
From: Taman Rasa Sayang, Cheras


QUOTE(pspslim007 @ Feb 9 2015, 09:28 PM)
Hi guys, i just bought a new mobo (Gigabyte G.1 Sniper B5)and a Proc (4690), and i`ve tried to fix myself and finally the system boots up but auto shutdown within 3-4 seconds, so i bring it back to the shop to fix it and apparently he says the bios is out of date, so he went to website and dl the bios and pendrive it, so heres my question how do i ready myself if next time this happens again, is there such thing as old mobo, old Proc that makes this thing happen, please explain. Thanks =X (Sorry if i use the wrong term, beginner here @.@)
*
I don't think it will happen again. The 4690 is a Haswell Refresh CPU, so the BIOS of the B85 mobo needed updating first in order for it to work.
