 NVIDIA GeForce Community V14

cstkl1
post Feb 9 2015, 05:15 PM


QUOTE(terradrive @ Feb 9 2015, 05:08 PM)
4K TVs are getting cheaper and cheaper

But 1440p and 4K monitors are still expensive zzzz
*
Those 4K TVs are only 30Hz, dude.

Moogle Stiltzkin

Nvidia can bring down the cost of the G-Sync module by finalizing it as an ASIC rather than an FPGA.

SSJBen
The lawsuit will fail, as ultimately the card was priced way lower, and the option can be given to trade up to a 980, which a lot of those people can't afford anyway, or to downgrade to a possible 960 Ti with a full 4GB.

If you can win against this... then what about Apple's 8GB/16GB/32GB/64GB claims, and the SSD/HDD makers, and everyone else using terminology loosely...

If dual cards can get away with printing 8GB on the 295X2 and 12GB on the Titan X, which is technically true because it is on the cards, then I really don't see how this will play out, because ultimately the price was way lower than a 980's and its performance figures reflect that.

The card works as it should for the price that people paid for it.

For SOM (Shadow of Mordor)... seriously, at 1440p 4GB isn't enough.

Tested this already on a 980 system versus mine. Even Dying Light is much smoother, even though the texture stream is capped at only 3614MB.


This post has been edited by cstkl1: Feb 9 2015, 05:28 PM
cstkl1
post Feb 9 2015, 05:31 PM


QUOTE(antaras @ Feb 9 2015, 05:17 PM)
Perfect for that "cinematic" feeling, according to some game makers. LOL~
*
It's a limitation of the HDMI standards.

HDMI 2.0 is not widely available yet.
Even among GPUs, AFAIK only the current Maxwell cards support it.
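A rough back-of-the-envelope sketch (Python) of why those early 4K TVs are stuck at 30Hz over HDMI. The TMDS clock limits below (~340 MHz for HDMI 1.4, ~600 MHz for HDMI 2.0) and the CTA-861-style 4K frame timing are commonly quoted figures, treated here as assumptions rather than spec quotes:

# Why HDMI 1.4-era 4K TVs top out at 30 Hz (assumed limits, see above).
def required_pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock needed for a given total frame size (incl. blanking) and refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

uhd_total = (4400, 2250)  # 3840x2160 active plus blanking (assumed CTA-861 timing)

for hz in (30, 60):
    clk = required_pixel_clock_mhz(*uhd_total, hz)
    print(f"4K@{hz}Hz needs ~{clk:.0f} MHz "
          f"(HDMI 1.4 ok: {clk <= 340}, HDMI 2.0 ok: {clk <= 600})")
# 4K@30 needs ~297 MHz, which fits HDMI 1.4; 4K@60 needs ~594 MHz, which needs HDMI 2.0.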



cstkl1
post Feb 9 2015, 05:38 PM


QUOTE(Moogle Stiltzkin @ Feb 9 2015, 05:36 PM)
True, but from what I read, 4K is only going to make a difference depending on the size of your screen.

Also, most media I have is either DVD, 720p or 1080p, so I don't think it will scale well at 4K resolution; the video quality will look less sharp, which is counter-intuitive.

4K to me is only going to be feasible when 4K media becomes the norm, and also when graphics cards are powerful enough for this new ultra-HD resolution. Right now GPUs aren't keeping up with 4K yet, especially in regards to 60fps, so you can forget it (for now anyway) cry.gif
Sounds great. Did they give a timeline for when this will happen? Or is that just what we hope will happen?  hmm.gif
*
Dude, G-Sync is working right now, fully functional and optimal...

If you were Nvidia... you'd just sit back and see how FreeSync works out with a mediocre AMD driver team.

Just waiting for AMD, as usual, to blame the scaler vendors if there's an issue.


cstkl1
post Feb 9 2015, 05:54 PM


QUOTE(amxpayne67 @ Feb 9 2015, 05:32 PM)
I went from GeForce 4 (2003) -> ATi 9550 (2005) -> GF G310M (2009, laptop) -> AMD HD7850 (2011) -> GF G820M (2014) -> GF GTX 970. Fanboyism? Nope. I go with whichever offers the best for what I need.
*
Since we are talking about dedicated cards, here's the history on my main rig (not including the ones I bought just to test, which number more than this):

[Spoiler: card history list]
...and I will never go back to ATI.

The reasons:

1. Packaging (yup). Every time I paid a lot, opened the box, and got that "hmm... err... did I get conned?" kind of feeling.
2. Drivers.
3. They always blame other people.
4. Hoovers (the blower coolers).
5. Expecting them to go bankrupt any time now.

But seriously, over my whole lifetime of gaming and buying games, Nvidia rarely disappoints me compared to ATI when it comes to just installing a game and playing. That's putting aside SLI issues.

Not saying Nvidia doesn't have its issues, but seriously, if money is no object, you rarely find a serious gamer having any issue with Nvidia on new game releases.

They [AMD] are wasting time on Mantle. But I guess they had no choice, as their driver team has a lot of issues with the DirectX 11 API, so they're trying to solve it going forward by putting it in game developers' hands... but ultimately DX12 will be better, as Microsoft has better people working on it.

That's why I'm skeptical about FreeSync. Nvidia has a lot of fantastic programmers, and they said G-Sync wasn't easy; they ended up having to use an FPGA.

This post has been edited by cstkl1: Feb 9 2015, 06:10 PM
cstkl1
post Feb 9 2015, 06:17 PM


QUOTE(ngkhanmein @ Feb 9 2015, 06:08 PM)
Totally agreed with your statement  notworthy.gif  thumbup.gif — we need quality, not quantity.
*
Just look at the Titan/780/780 Ti/980 cooler.

It looks good. Even before you test it, it reaffirms that "money well spent" kind of feeling.

It's the same as going to a massage parlour, ahem, and getting a HJ for RM50 from a pro auntie versus going to another with Miss Universe for RM150.
If that anecdote is that simple... why isn't buying a GPU the same?
ATI is the former while Nvidia is the latter.

Look at phones. Apple led in the beginning with design, asked for a premium, and still does, lol... Android came in with crappy-looking phones. Phones now are packaged like iPhones. Android kicked Apple's butt. No more bulky huge boxes.

It's one of the reasons EVGA and Asus have excelled for a long time, and now even their low/mid-end cards get good packaging. Gaygay (actually got a warning on OCN for calling Gigabyte "Gaygay"...) has totally revamped itself after realizing this. People will pay for it.

Spending money on design and packaging matters.

In that respect I will still recommend the 970 over the 290X any day.

Btw, the full Maxwell has a lot of work to do to satisfy the enthusiast level. Everybody is waiting for 4K to be dominated by dual cards at a minimum of 60fps with everything turned up. This is no small feat.

Current offerings, overclocked, manage that at 1440p. You're going to need 2.25x the performance of these cards to achieve it at 4K. That's no joke.
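For anyone wondering where the 2.25x comes from, it's just the pixel-count ratio; a tiny sketch, assuming GPU load scales roughly linearly with pixels rendered per frame:

# 4K vs 1440p pixel counts (assumption: render cost scales ~linearly with pixel count)
pixels_1440p = 2560 * 1440   # 3,686,400
pixels_4k    = 3840 * 2160   # 8,294,400
print(pixels_4k / pixels_1440p)  # 2.25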

Signing off... gee, I posted a lot today while waiting for the waifu at the airport. But this is the wholehearted truth about how I feel about ATI.
You know that feeling where you tell yourself you never want to be conned/scammed again... that's the reason I don't buy ATI anymore, and I doubt they will change.

This post has been edited by cstkl1: Feb 9 2015, 06:28 PM
cstkl1
post Feb 9 2015, 08:12 PM


QUOTE(gogo2 @ Feb 9 2015, 06:55 PM)
WARNING!!! FANBOY detected.

If you buy R9 290X Lightning Edition, you'll get great packaging. Packaging has nothing to do with ATI. Asus also got ATI card. I'm not sure why you blame ATI for packaging. LOL...
*
Ignorant person detected. I've been buying cards for a long time. If we want to talk about cards, then let it be about reference cards, as the manufacturer intends to sell them. Reference cards. If anything goes, then LN2 comes into play. Buy the top-end cards from both camps; until then you won't understand.

Btw, scroll up to see my main rig's card history.

QUOTE(terradrive @ Feb 9 2015, 07:03 PM)
But still, 290X CF and 980 SLI are quite equal in 4K gaming.
*
Don't trust benchies; they don't represent gameplay. Not siding with either camp on this. Dying Light, for example, is very dependent on the preset texture stream buffering, which gives 780/780 Ti SLI smoother gameplay than 970/980.

This post has been edited by cstkl1: Feb 9 2015, 08:22 PM
cstkl1
post Feb 9 2015, 08:18 PM


QUOTE(SSJBen @ Feb 9 2015, 08:03 PM)
ONLY if games use it. DX12 does not automatically turn all SLI configs into a single card seen by the system.
*
I foresee coding issues. Say GPU 2 wants to access GPU 1's memory controller for that bit of VRAM; then there's the latency of the SLI connector.
Maybe CrossFire will work better, since the 290 uses the PCIe lanes.
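A rough sketch of why fetching VRAM that lives on the other GPU is painful. The link bandwidths are assumptions for illustration only (classic SLI bridge around 1 GB/s, PCIe 3.0 x16 around 15.75 GB/s theoretical, local GDDR5 around 224 GB/s), and the 256MB working set is hypothetical:

# Time to pull a chunk of the other GPU's VRAM over different links (assumed bandwidths).
def transfer_ms(megabytes, gb_per_s):
    return megabytes / 1024 / gb_per_s * 1000

chunk_mb = 256  # hypothetical texture working set resident on the other GPU
for name, bw in [("SLI bridge", 1.0), ("PCIe 3.0 x16", 15.75), ("local VRAM", 224.0)]:
    print(f"{name:12s}: {transfer_ms(chunk_mb, bw):7.2f} ms for {chunk_mb} MB")
# A 60fps frame budget is ~16.7 ms, so anything remote-fetched every frame hurts badly.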

cstkl1
post Feb 9 2015, 08:38 PM


QUOTE(gogo2 @ Feb 9 2015, 08:29 PM)
True also. I assume most AAA titles will support it, since DX12 is available on both nVidia and AMD.
Yes, I'm ignorant because I ignore reference cards. Most of the time reference cards are the bare essentials. Non-reference cards will have more FPS, better cooling and a beefier VRM.  thumbup.gif
*
Agreed; in that case a water-chilled, volt-modded EVGA Classified Nvidia card with custom VRAM timings reigns supreme.

Let me guess, you're using a 290X Lightning.




cstkl1
post Feb 10 2015, 12:49 PM


QUOTE(goldfries @ Feb 10 2015, 11:56 AM)
TDP != power draw

Interestingly, nVidia states 120W for the GTX 960, and I managed to get the card to reach or exceed 120W power draw.

AMD on the other hand: if it's a 250W TDP card, it takes so much effort and I still can't push it to 250W. sad.gif With my R9 280X + R9 280X + FX-9590 configuration I'd expect 250W + 250W + 200W = 700W, with a wall draw around 800W, but even breaching 600W at the wall is difficult. biggrin.gif
*
Eh, curious how you managed to max out the power draw of the card. Nvidia has had a better driver API for maxing out the performance of their cards ever since the "wonder driver"... you need to account for that.

The best test without throttling is 3DMark11 GPU Test 1, custom, everything maxed with AA disabled. A warning to others about this test: even under water, where Furmark/folding temps are only 41C, this hits 50-55C after a few loops.
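Just to spell out the arithmetic in the quoted post: sum the component TDPs and divide by an assumed PSU efficiency to estimate wall draw. TDP is a thermal design figure rather than a guaranteed draw, so real numbers can land well below this; the 88% efficiency is a hypothetical figure:

# Estimated wall draw from component TDPs and an assumed PSU efficiency.
component_tdp_w = {"R9 280X #1": 250, "R9 280X #2": 250, "FX-9590": 200}
psu_efficiency = 0.88  # hypothetical, roughly 80+ Gold territory at this load

dc_load = sum(component_tdp_w.values())   # 700 W on the DC side, if everything were fully loaded
wall_draw = dc_load / psu_efficiency      # ~795 W at the socket
print(f"DC load ~{dc_load} W, estimated wall draw ~{wall_draw:.0f} W")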

This post has been edited by cstkl1: Feb 10 2015, 01:00 PM
cstkl1
post Feb 11 2015, 01:35 AM


QUOTE(alfiejr @ Feb 11 2015, 12:20 AM)
Getting an error when opening GeForce Experience, so please don't download the driver yet. It's bugged  doh.gif . Performance is also slightly worse than with the previous driver  sad.gif
*
Performance is no different for me... a slight bump, even. But GeForce Experience is bugged at the driver tab. Uninstall and reinstall; as long as you don't touch the driver tab, it works...

It must be some coding issue somewhere, as this was happening before I even installed the new driver.

cstkl1
post Feb 11 2015, 01:46 AM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(goldfries @ Feb 11 2015, 01:39 AM)
need feedback from you guys about Shadowplay - when recording is in progress, how bad is the framerate drop?
*
10-20% on a single card, 5-10% on dual... that's the minimum, and it sometimes depends on your VRAM, i.e. whether it's hitting close to the max.

This post has been edited by cstkl1: Feb 11 2015, 01:46 AM
cstkl1
post Feb 11 2015, 02:35 AM


QUOTE(goldfries @ Feb 11 2015, 01:51 AM)
Looks like it is less of a resource hog compared to FRAPS.

What's the recorded video size? The ones from FRAPS are huge, which is a problem when one games on an SSD.
*
AFAIK it's based on length.

60fps recording is limited to 1080p.

FRAPS files are big because FRAPS doesn't use H.264 AVC encoding.

Btw, in some games you may not see any drop in fps at all.
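A rough sketch of why the codec matters so much for file size; the bitrates below are purely assumed figures for illustration, not ShadowPlay's or FRAPS' actual settings:

# File size per minute at a given bitrate (assumed bitrates).
def gb_per_minute(mbit_per_s):
    return mbit_per_s * 60 / 8 / 1024  # Mbit/s -> GB per minute

h264_mbps  = 50    # assumed H.264/AVC bitrate for 1080p60
fraps_mbps = 800   # assumed rate for lightly-compressed full-frame capture

print(f"H.264 capture: ~{gb_per_minute(h264_mbps):.2f} GB/min")
print(f"FRAPS-style:   ~{gb_per_minute(fraps_mbps):.2f} GB/min")
# Length, resolution/framerate and bitrate set the size; the codec sets the GB per minute.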

This post has been edited by cstkl1: Feb 11 2015, 02:36 AM
cstkl1
post Feb 11 2015, 10:58 AM


Looks like they've already fixed the GeForce Experience issue, etc.

Updated to the latest beta; no issues there either.


cstkl1
post Feb 11 2015, 11:02 AM


QUOTE(goldfries @ Feb 11 2015, 10:58 AM)
https://mirillis.com/en/products/action.html

Smooth Windows Aero HD desktop recording, easy tutorial creation!
Action! defines a new standard of performance
and user experience for real-time desktop
recording software. Record smooth high framerate
HD videos of your desktop and applications activity,
add microphone audio commentary and create great tutorials with ease!

--------

so, the above advertised feature is not part of it?

on the features tab it says "30bit Windows desktop recording".

and

Video recording modes
- Games & applications
- Active screen
- Active desktop region
*
Does it have its own overhead, or is it based entirely on the GPU vendors' proprietary tech?
cstkl1
post Feb 14 2015, 07:52 PM


Just needed to share this.
Laughing like mad; I almost fell off my stool just now.


cstkl1
post Feb 16 2015, 05:23 PM


I doubt there will be a 980 Ti unless the 390 is only a marginal upgrade.

Next should be the full Maxwell. But they are running into a few issues at the moment, AFAIK.


cstkl1
post Feb 23 2015, 06:33 PM


Most probably a USD 12 per-person rebate, or an offer to step up to a reference 980 (at RSP) with buyback at the invoice price or the reference 970 price, whichever is lower.

Only for US purchases.

Then they'll screw everybody who takes up that offer when they RMA, by enforcing the full conditions of the warranty.

This post has been edited by cstkl1: Feb 23 2015, 06:36 PM
cstkl1
post Feb 27 2015, 06:28 PM


QUOTE(defaultname365 @ Feb 27 2015, 10:48 AM)
The future of gaming will be gamers wearing 'headgears' bobbing their head around.  tongue.gif  Never tried VR so... 

Anyways, looking forward to the announcement, should be accompanied by some awesome showcase of some game title.

On a second note, yikes on the lawsuit and stuff...  icon_question.gif  My take on it is that customers are always right and they paid for what they believe is 'proper 4.0' and not '3.5 + 0.5'. The big boss saying there was a miscommunication and it is actually a feature just added more fuel to the fire.

Sadly, it is not going to be an easy way out for Nvidia on this... unless some kind of recall/refund/compensation/offer on the next GPU purchase perhaps is provided. Easier said than done for sure.

And even sadder is that the 980 Ti has been pushed to 2016.

http://www.ecumenicalnews.com/article/next...d-to-2016-28355

I am just hoping for at least a 6GB variant of Maxwell...  laugh.gif pointless at 1080p but I plan to jump to 4k. Pretty sure in future even at 1080p higher than 4Gb VRAM usage will be in place.
*
Read up on why it's a feature, dude, and why it's a breakthrough, before posting.

cstkl1
post Feb 27 2015, 07:28 PM


QUOTE(defaultname365 @ Feb 27 2015, 07:06 PM)
Lol, I know...  I stand by what I said.  smile.gif

Let's say a car is advertised as having 400 HP. When it goes beyond 350 HP (e.g. 360 HP), it starts to slow down tremendously. So how can it be advertised as a 400 HP car? It is supposed to run just fine at over 350 HP.  laugh.gif 

So whatever "feature" it is, breakthrough or not, it is not making users happy. The so-called "feature" at the end of the day is not mentioned to the user upfront and they feel cheated for what they believed they paid for.
*
No, not that.

http://www.pcper.com/reviews/Graphics-Card...tations-GTX-970

Read it, dude.

Understand why this is actually a first. Before this it was never possible. By the design used in previous GPUs, including those from AMD, the GTX 970 should have been a 3GB card.

So it's a feature... because it's a breakthrough. Normally when you disable an L2 slice you take the ROPs and memory controller off with it, and AFAIK it's normally an even number.
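A tiny model (Python) of the 3.5GB + 0.5GB layout the PCPer article describes. The ~196 GB/s and ~28 GB/s segment bandwidths are the commonly quoted approximations, and the fill-the-fast-segment-first behaviour is a naive assumption, not how the driver actually allocates:

# Naive model of the GTX 970's split VRAM: blended bandwidth vs. memory in use.
SEGMENTS = [
    {"name": "fast 3.5 GB", "size_gb": 3.5, "bw_gb_s": 196.0},
    {"name": "slow 0.5 GB", "size_gb": 0.5, "bw_gb_s": 28.0},
]

def blended_bandwidth(used_gb):
    """Fill the fast segment first, then weight bandwidth by where the data sits."""
    remaining, weighted = used_gb, 0.0
    for seg in SEGMENTS:
        in_seg = min(remaining, seg["size_gb"])
        weighted += in_seg * seg["bw_gb_s"]
        remaining -= in_seg
    return weighted / used_gb if used_gb else 0.0

for usage in (3.0, 3.5, 4.0):
    print(f"{usage:.1f} GB in use -> blended ~{blended_bandwidth(usage):.0f} GB/s")
# Under 3.5 GB it behaves like a normal card; past that the slow segment drags the average down.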



This post has been edited by cstkl1: Feb 27 2015, 07:33 PM
cstkl1
post Feb 27 2015, 07:53 PM


QUOTE(defaultname365 @ Feb 27 2015, 07:50 PM)
It's a feature, sure. A breakthrough, sure. A first, sure.

But why did Nvidia keep quiet for months, only for one user to point out the oddity and trigger a cascade of in-depth looks into how the 970 was built?  hmm.gif 

Point is, they should have been upfront. This feature/breakthrough could no doubt benefit future GPUs; the conventional way might provide just e.g. 7GB of VRAM, but with this method they could bump the GPU to "8GB VRAM" with the last 0.5GB being slower. But DO mention it upfront. Not all gamers pore over Nvidia architecture slideshows to learn how their GPUs are built.

We are dealing with an issue of misinformation here to customers who bought what they 'believed' in and there is no way to defend Nvidia.

Good job on the feature, bad job on relaying the information.
*
Dude, you're just digressing now into the "misled" saga.

I was addressing your remark about the feature. At least now you know.

Simple question: does the 295X2 have 8GB, and the Titan Z 12GB, and is it written on the box? Let's not even go further back through all the dual-GPU single cards, back to the 7950GX2 and 3870X2. Is there any mention in their specs that only half is usable?

In retrospect, the 970 does have a usable 4GB. So why was nobody harping about those cards before?

The 970, for what the feature actually is, is in line with its price. A lot of people were initially happy, thinking Nvidia was being stupid by selling a card 40% cheaper with the same amount of ROPs/L2 and the full 4GB of a 980.

The ones I pity are those who went SLI for 4K. Those guys were truly misled and are not getting the performance they expected.

The rest, single-card users at 1080p, won't see any difference.

This post has been edited by cstkl1: Feb 27 2015, 08:20 PM
