 NVIDIA GeForce Community V19, RTX 5000 unveiled

SSJBen
post Aug 30 2019, 12:51 PM

Bro Skylinestar, we all know sound cards are almost pointless these days.

Anyone mildly interested in audio should get an external solution, get a long USB cable, and place it as far away from your PC as you can. There, pristine audio.
SSJBen
post Nov 7 2019, 03:10 PM

The Guru3D review is wrong. He forgot to account for the fact that each time he loads the game in the same place, the world is different (weather, NPC count, and random events). RDR2 is a very difficult game to benchmark; in fact, no benchmark of it will ever be fully accurate because of how dynamic the world is.

Also, no benchmark thus far has gone to the Chinatown market in Saint Denis. That's the place that will absolutely bring any 4-core CPU to its knees. If it's night time and there's a storm, the GPU will take the beating of its life.

It says a lot about how much these benchmarkers know about the game - which is next to nothing.
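To put numbers on that, here's a minimal sketch (with entirely hypothetical FPS figures) of the only honest way to benchmark a game this dynamic - repeat the same load several times and report the spread, not a single run:

```python
import statistics

# Hypothetical per-run average FPS from repeatedly loading the same save.
# The spread comes from dynamic weather, NPC counts and random events.
runs_fps = [71.2, 68.5, 74.0, 66.9, 72.3, 69.8]

mean = statistics.mean(runs_fps)
stdev = statistics.stdev(runs_fps)

print(f"mean: {mean:.1f} fps, stdev: {stdev:.1f} fps")
print(f"typical spread: {mean - 2 * stdev:.1f} to {mean + 2 * stdev:.1f} fps")
# A single-run result separating two cards by a few percent means nothing
# when the game itself swings this much between loads.
```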

This post has been edited by SSJBen: Nov 7 2019, 03:11 PM
SSJBen
post Jan 20 2020, 05:54 PM

Meh, as long as AMD doesn't make an effort to improve their drivers, all that hardware behind it is pointless. The 5700 XT and 5700 are still crashing five months after release lol.
SSJBen
post Jun 25 2020, 05:40 PM

HAGS has significant performance issues in PhysX-supported games.

Also, the latest driver broke HDR with madVR again.
SSJBen
post Sep 2 2020, 03:01 PM

The price tells me there's space for a 3080 Ti next year, and possibly a 3070 Super (or Ti) if yields improve.
SSJBen
post Sep 2 2020, 04:47 PM

QUOTE(mois @ Sep 2 2020, 04:38 PM)
The 3080 doesn't have HDMI 2.1. If I connect it to a C9, what features am I missing from HDMI 2.1? This GPU is perfect for 4K leh.
Only the 3090 has HDMI 2.1.
*
All 3 announced cards have HDMI 2.1 ports. Where did you read that the 3080 doesn't have one?

Here's the official page with official specs from Nvidia themselves - https://www.nvidia.com/en-us/geforce/graphi...eries/rtx-3080/
Check the full spec sheet.

The only thing that isn't clear is whether the HDMI 2.1 controller is the full 48 Gbps version or the reduced 40 Gbps one. My money is on the 40 Gbps silicon.
SSJBen
post Sep 2 2020, 04:50 PM

QUOTE(defaultname365 @ Sep 2 2020, 04:43 PM)
Yes, definitely. Now waiting for AMD's turn with Big Navi.

I have my doubts though as to how AMD is going to beat Nvidia on both performance + price.

I'm eager to see how AMD brings Ray Tracing.
*
Big Navi will disappoint. AMD has nothing in the bag to fight Nvidia on Ampere's front; they're still playing catch-up to Turing. So unless they're willing to take a serious profit cut by pricing Big Navi at the 3070's price point with performance above it, there's no chance for AMD to even compete.
SSJBen
post Sep 2 2020, 04:58 PM

QUOTE(mois @ Sep 2 2020, 04:53 PM)
Aha! They didn't highlight it in the presentation for the 3080, only for the 3090. That's why I missed it  laugh.gif

Do you use a Corsair lapboard for your HTPC setup?
*
It would have been very strange if HDMI 2.1 were omitted from the 3080, considering it's their flagship gamer card.

Honestly, no. I don't like M&K on the couch. Just using an Xbox One Elite controller instead.
SSJBen
post Sep 2 2020, 06:54 PM

QUOTE(i7xQTi @ Sep 2 2020, 06:50 PM)
Would this cause an issue for most people though?
I seem to remember 40 Gbps is able to do 4K@120Hz, 10-bit.

I'm banking on the 3080 to finally be able to do 100+ fps on 4K OLEDs.
*
There are no consumer displays that are 4K 12-bit anyway, so it doesn't matter lol. 120 fps at 4K on a 3080 should be possible with DLSS, I reckon.
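For context on the 40 vs 48 Gbps question, here's a back-of-envelope sketch of the active-pixel data rates at 4K120. It deliberately ignores blanking intervals and FRL line-coding overhead, both of which add a sizeable chunk on top:

```python
# Back-of-envelope active-pixel data rate at 4K120 (ignores blanking and
# FRL line-coding overhead, which add a sizeable chunk on top of this).
def active_gbps(width, height, hz, bits_per_component):
    return width * height * hz * bits_per_component * 3 / 1e9  # RGB / 4:4:4

for bpc in (8, 10, 12):
    print(f"4K120 {bpc}-bit 4:4:4: ~{active_gbps(3840, 2160, 120, bpc):.1f} Gbps")
```

Once that overhead is included, 12-bit 4:4:4 outgrows a 40 Gbps link while 10-bit still fits - which lines up with the usual "40 Gbps = 4K120 10-bit" rule of thumb.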
SSJBen
post Sep 2 2020, 08:52 PM

QUOTE(joeblow @ Sep 2 2020, 07:18 PM)
Sorry, noob question - not sure if this thread is the right one to ask in. Suppose my PC is only for watching streamed and downloaded shows (mainly 1080p, sometimes Blu-ray) played using Media Player Classic.

Is the 3060 coming out anytime soon? Or would a 3070 be overkill? Mainly for future-proofing and for linking the PC via HDMI 2.0 to an LG OLED TV in 4K (1440p?). Actually, my question is: will the picture on a 4K TV or monitor be better with a better graphics card, or not much different?
*
Mainstream Ampere will come out Q1-Q2 next year.

If you're just streaming, then the GPU doesn't matter. But if you're like me and use the GPU to leverage madVR, then yes, it matters quite a bit.


QUOTE(i7xQTi @ Sep 2 2020, 07:22 PM)
Depends. For 4K HDR at the moment, my 1080 Ti can handle high settings in madVR, but the 970 can't.
For normal displays, I think a 1050 Ti should be sufficient for 4K.
*
That's because the 970 lacks a 10-bit HEVC decoder and relies on hybrid decoding.


QUOTE(joeblow @ Sep 2 2020, 07:29 PM)
Actually, I was not being clear enough. I know the software used to play the stream or video matters more, and whether it can use the GPU's hardware capabilities or not. And the TV picture quality, refresh rate etc. also play a part.

E.g., when I play an Acestream soccer match linking my laptop to the TV, the picture seems jerky and definitely not native 4K, i.e. the TV upscales to 4K.

Suppose I get a better graphics card capable of supporting native 4K output, will the result be better? Even though the 3070 might seem overkill, I think by the time I plan to buy at the end of this year, when the new AMD CPUs and GPUs are out, the 3070 will drop to 2k to 2.5k. Unless the 3060 is out by then or the RTX 20 series drops in price big time - always better to go for the higher tech with the smaller chip (7nm).
*
What's happening is that the stream player itself is dropping frames. It has nothing to do with hardware; it's simply the web player being trash.

And no, if anything the 3070 will be even more expensive at the end of the year. It's naive to think prices will drop.


QUOTE(december88 @ Sep 2 2020, 08:16 PM)
Honestly, all the 4K TVs in the market have Android OS; you can just get your content directly from there instead of using your PC. If you insist on a PC, then integrated graphics are more than enough for your need, which is streaming - just make sure your motherboard & 4K TV/monitor support HDCP (to stream 4K), and if they can support 60Hz, even better. I made a build for my friend this year for RM1.6k: future-proof 4K@60Hz with a 512GB SSD, 8GB RAM and a Ryzen 2200G.
*
No, they don't. Only Sony, Philips and Hisense TVs use Android OS.

Your HTPC build is not future-proof at all, because it doesn't have a GPU that supports AV1 decoding. Only Nvidia's upcoming 30 series has AV1 decoding. AV1 is going to be the next big thing after H.264, because HEVC (aka H.265) is royalty-encumbered whereas AV1 is royalty-free. Eventually everything will switch over to AV1, and that future is not very far away.
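If you want to check what your own box can do, here's one way to list the AV1 decoders an ffmpeg build exposes (a sketch assuming ffmpeg is on your PATH; decoder names vary by build and GPU):

```python
import subprocess

# List the AV1 decoders your ffmpeg build exposes (assumes ffmpeg on PATH).
# Expect the native "av1" plus software decoders like libdav1d/libaom-av1;
# hardware-backed entries vary by build and GPU generation.
out = subprocess.run(["ffmpeg", "-hide_banner", "-decoders"],
                     capture_output=True, text=True).stdout

for line in out.splitlines():
    if "av1" in line.lower():
        print(line.strip())
```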

SSJBen
post Sep 3 2020, 03:42 PM

QUOTE(dexth @ Sep 3 2020, 01:30 PM)
Hey guys, getting ready for Ampere to pair with the LG C9's HDMI 2.1. Now I'm confused with cable selection, mainly around length and HDCP version.
Length - any issue with length? I know HDMI 2.0 degrades at lengths of more than 2m, where I'd need to use an active HDMI cable.
HDCP - all the current HDMI cables sold on Lazada are HDCP 2.2. Any compatibility issue with Ampere's HDCP 2.3?

Any advice is much appreciated.
*
Length - HDMI 2.1 at 48 Gbps requires at least 22 AWG copper cores for lengths up to 3m. Beyond that, the cable has to be amplified, either as an active cable or with a repeater. An active cable can go up to 10m, some up to 15m - there is no set standard for this. Beyond 15m, you'll have to look at fiber optic core HDMI cables, and by that I mean REAL fiber optic using glass, NOT plastic or mylar.

HDCP - HDCP is backwards compatible, but not forwards compatible. HDCP has got nothing to do with bandwidth; it's simply digital copy protection. So if point A (the source) and point B (the display) are both HDCP 2.3 compliant, it doesn't matter what cable you use, because all HDCP does is activate the handshake flag in the data signal.


Also, one more thing to add: there is no such thing as an "HDMI 2.1 cable" per se. There are only HDMI cables that can pass HDMI 2.1 bandwidth and cables that can't. If the packaging says "HDMI 2.1 cable", that means it hasn't been certified to pass it. This is why some thick, garden-hose-like HDMI cables from, say, 2010 can pass HDMI 2.0 bandwidth just fine even though they were never sold as "HDMI 2.0" cables. The cable is simply well built, with heavy-gauge copper cores (likely 18 AWG or even thicker), which keeps the signal from dropping thanks to low resistance and low capacitance.
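To see why gauge scales with length, here's a quick sketch of per-conductor DC resistance from the standard AWG diameter formula. DC resistance is only part of the story at 48 Gbps (skin effect and dielectric loss dominate up there), but the trend is the point:

```python
import math

RHO_CU = 1.72e-8  # resistivity of copper, ohm*m

def awg_diameter_m(awg):
    # Standard AWG formula: 0.127 mm * 92 ** ((36 - awg) / 39)
    return 0.127e-3 * 92 ** ((36 - awg) / 39)

def resistance_ohm(awg, length_m):
    area = math.pi * (awg_diameter_m(awg) / 2) ** 2
    return RHO_CU * length_m / area

for awg in (28, 22, 18):
    for length_m in (3, 10):
        r = resistance_ohm(awg, length_m)
        print(f"{awg} AWG, {length_m} m: {r * 1000:.0f} mOhm per conductor")
```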

This post has been edited by SSJBen: Sep 3 2020, 03:46 PM
SSJBen
post Sep 3 2020, 11:31 PM

QUOTE(hashtag2016 @ Sep 3 2020, 09:56 PM)
I guess so - maybe a SUPER would take its place, if necessary.. drool.gif
*
I'd be very surprised if they released a 3070 SUPER/Ti by the end of the year. Samsung won't have the yields needed to push out GPUs in volume until next year.
SSJBen
post Sep 4 2020, 06:00 PM

QUOTE(sai86 @ Sep 4 2020, 04:45 PM)
WTA, which brands & models of the 3xxx series use the reference PCB?

I stumbled on this on the EK WB page for the 3080 and 3090.

This block is made for reference PCB 3080 and 3090 cards. Founders Edition does NOT have a reference PCB.

An EK water block for Nvidia GeForce RTX 3080 and 3090
EK-Quantum Vector RTX 3080/3090 D-RGB water block is compatible with most reference design (not Founders Edition) GeForce RTX 3080 and 3090 based graphics cards, but as always, we recommend that you refer to the EK® Cooling Configurator for a precise compatibility match.

https://www.ekwb.com/shop/ek-quantum-vector...gb-nickel-plexi

my current Palit 1080 Ti is an FE with a WB for the reference board  hmm.gif
*
The MSI Ventus, for example, is a "reference" PCB.
SSJBen
post Sep 4 2020, 06:20 PM

QUOTE(sai86 @ Sep 4 2020, 06:13 PM)
o.O, how do you know a board with custom cooling is a reference board? I can't find anything about it on the spec sheet for the 3080 Ventus.

From the look of this WB, it's so small compared to the Ventus.

*
You're looking at the cooler. The actual PCB is shorter than the cooler's length.

Not saying that the Ventus will fit on this EK block though.


You can see there's a good 2 inches of space between the end of the PCB and the end of the cooler.


This post has been edited by SSJBen: Sep 4 2020, 06:21 PM
SSJBen
post Sep 7 2020, 03:34 PM

QUOTE(edmund_yung @ Sep 6 2020, 11:08 PM)
Yeap, IMO RTX IO only solves the speed of loading high-res textures; there's tons of other stuff the CPU has to process. Look at how CS:GO's FPS keeps scaling if you keep giving it a faster CPU... without high-res textures. Hope that with the arrival of the next-gen consoles, developers will start to fully utilize 8-core CPUs.
*
High-res textures are just a small part of it. Strictly speaking, there are very few truly high-res textures in games today; 4K or 8K textures aren't really high-res. Textures are usually authored at 16K or higher nowadays, then compressed down (losslessly, most of the time at least).

CPUs have to deal with draw calls. More cores give a CPU more capacity for them, but it also has to be efficient at issuing them.
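For a sense of the sizes involved, here's a quick sketch of raw RGBA8 texture footprints (illustrative only - shipped games use block compression plus package-level compression on top):

```python
# Raw (uncompressed) size of a square RGBA8 texture: width * height * 4 bytes.
def raw_mib(side_px, bytes_per_px=4):
    return side_px * side_px * bytes_per_px / 2**20

for side in (4096, 8192, 16384):
    print(f"{side}x{side} RGBA8: {raw_mib(side):,.0f} MiB uncompressed")
```

A single 16K texture is a gigabyte raw, which is exactly why nothing ships at authoring resolution.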

This post has been edited by SSJBen: Sep 7 2020, 03:36 PM
SSJBen
post Sep 7 2020, 07:09 PM

QUOTE(cstkl1 @ Sep 7 2020, 04:18 PM)
What would be cool is if people from Asia banded together and boycotted them. But nah.. Asians are not united and like to screw each other over.. so that day will never happen. Asus knows this. So they just lol.
*
Couldn't have said that last line better myself. Well said.
SSJBen
post Sep 17 2020, 04:30 PM

QUOTE(kskoay @ Sep 17 2020, 08:20 AM)
PCIe Gen3 vs Gen4 on the RTX 3080: overall, Intel with PCIe Gen3 is still slightly better in FPS than AMD with PCIe Gen4 for current games.

https://youtu.be/xmhpPhm3P4k
*
Those numbers are still within the margin of error.

The more important question is what the difference between RTX IO on PCIe 3 vs 4 will look like once DirectStorage launches next year.
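For scale, here's a sketch of the raw per-direction link bandwidths (standard PCIe figures):

```python
# Usable per-direction bandwidth of a PCIe x16 link (128b/130b encoding).
def x16_gb_per_s(gt_per_s):
    return gt_per_s * (128 / 130) * 16 / 8  # GT/s per lane -> GB/s over 16 lanes

print(f"PCIe 3.0 x16: ~{x16_gb_per_s(8.0):.1f} GB/s")   # ~15.8 GB/s
print(f"PCIe 4.0 x16: ~{x16_gb_per_s(16.0):.1f} GB/s")  # ~31.5 GB/s
# A fast Gen4 NVMe SSD tops out around 7 GB/s, well under either link, so
# any DirectStorage difference would come from traffic patterns, not raw ceiling.
```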

This post has been edited by SSJBen: Sep 17 2020, 04:32 PM
SSJBen
post Sep 18 2020, 04:31 PM

QUOTE(cstkl1 @ Sep 18 2020, 12:42 PM)
1. PSU single rail doesn't matter.
2. The issue is not power delivery but the quality of the cable in handling the heat..
*
This +100.
Especially bolded point no. 2.

I can't believe people are still being taken for a ride by PSUs that give you puny 18 or 20 AWG cables for the 12V rails. doh.gif

Be smart, guys: 16 AWG is the minimum, and if you really want proper headroom, just make your own cables with 12 AWG copper.
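Here's a rough sketch of why gauge matters for heat - I²R dissipation per wire at an illustrative (hypothetical) GPU-class load:

```python
import math

RHO_CU = 1.72e-8  # resistivity of copper, ohm*m

def ohms_per_m(awg):
    d = 0.127e-3 * 92 ** ((36 - awg) / 39)  # standard AWG diameter formula
    return RHO_CU / (math.pi * (d / 2) ** 2)

# Illustrative load: ~8 A per 12V conductor (roughly a 300 W card spread
# over a few wires) on a 0.6 m cable run.
current_a, length_m = 8.0, 0.6
for awg in (20, 18, 16, 12):
    r = ohms_per_m(awg) * length_m
    print(f"{awg} AWG: {r * 1000:.1f} mOhm -> {current_a**2 * r:.2f} W of heat per wire")
```

The heat scales with the square of the current, so the thin-gauge penalty gets much worse on transient spikes.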
SSJBen
post Sep 18 2020, 04:35 PM

Pretty impressed by the Asus TUF 3080. That PCB design is pretty elaborate considering it's their base/entry model. I'm also surprised they put a proper finned heatsink on the memory and VRM; the MSI Gaming X Trio, which is priced higher, doesn't even have that lmao.

Speaking of MSI, I knew their graphene backplate was bullshit and fake marketing. Why would you want graphene as a backplate? It serves no purpose - sure, it has high tensile strength, but that's pointless added cost.
SSJBen
post Sep 18 2020, 04:53 PM

QUOTE(munak991 @ Sep 18 2020, 04:42 PM)
You can do that with 12 AWG cable @_@?
I have 100 feet of 12 AWG oxygen-free copper cable for hi-fi from Monoprice.
*
Yes you can, and I did it for my 8-pin CPU cable, as the stock cable was too short to reach the top of the case with the PSU mounted at the bottom. I used a leftover reel of Belden 5000UE to do it.

People tend to forget that under those sheaths and jackets, everything is copper. The copper doesn't know or care what it's dressed up with lol - it's simply a conductor from point A to point B.
