NVIDIA GeForce Community V19, RTX 5000 unveiled

#1 | SSJBen | Apr 3 2018, 07:52 PM
QUOTE(cT.!Kee! @ Apr 3 2018, 06:30 PM): Ignorant day was 2 days ago dude.

#3 | SSJBen | Apr 5 2018, 04:30 PM
QUOTE(craxors @ Apr 5 2018, 11:04 AM)

As long as you have a card from after 2006 - which at the bare minimum has HDMI 1.3 - you can bitstream DTS-HD MA and Dolby TrueHD to your receiver. Both codecs support up to 7.1 channels at 48kHz. If you're asking about LPCM 7.1, then yes to that as well.

HDMI 1.1 itself can already transmit lossy DTS and Dolby Digital, which support up to 5.1 channels. So yeah, unless you have a card from your previous life or something, you're good to go.
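
As a quick sanity check before worrying about the HDMI chain, here's a minimal sketch (assuming ffprobe is installed and on PATH; "movie.mkv" is just a placeholder filename) that shows what the source's audio track actually carries:

```python
# List the audio streams of a file so you know whether you're dealing with
# DTS-HD MA / TrueHD (lossless, needs HDMI 1.3+ bitstreaming) or plain DD/DTS.
import subprocess

result = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "a",
     "-show_entries", "stream=codec_name,profile,channels,sample_rate",
     "-of", "default=noprint_wrappers=1", "movie.mkv"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
# Typical output for a lossless track:
#   codec_name=dts
#   profile=DTS-HD MA
#   channels=8
#   sample_rate=48000
```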

#4 | SSJBen | Apr 5 2018, 05:10 PM
QUOTE(craxors @ Apr 5 2018, 04:40 PM): thanks for the highlight.. meaning can use HTIB? hdmi from gpu plug in to htib receiver.. and streaming 5.1 sound... yes?

If your HTIB can accept bitstreaming, then it should not be a problem. Otherwise you'll have to send the HDMI signal to your display, then use optical out from the TV to your HTIB. But that will also depend on whether your TV can send DD/DTS through optical or not.

This post has been edited by SSJBen: Apr 5 2018, 05:11 PM

#5 | SSJBen | Apr 6 2018, 04:28 PM
QUOTE(craxors @ Apr 6 2018, 09:50 AM): not gonna involve TV. i still using my computer monitor for display. objective; PC set + HTIB as sound output. so... GPU <- hdmi -> monitor... monitor <- hdmi -> receiver's hdmi output.... possible? or... if using optical from MOBO direct to optical port on HTIB... workable? still produce 5.1 sound?

No to option 1. Yes to option 2.

But with option 2, you'll have to make sure the bitstream from your motherboard's optical out is either DD or DTS. Not all motherboards support DTS bitstreaming either, so if that's the case, then with DTS sources you'll have to re-encode them into DD first (which can be done with ReClock, for example).
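
If ReClock isn't an option, the same DTS-to-DD conversion can also be done offline - a rough sketch (assuming ffmpeg is on PATH; filenames are placeholders), which re-encodes only the audio track to AC-3 while leaving the video untouched:

```python
# Re-encode a DTS track to Dolby Digital (AC-3) so it fits through a DD-only
# optical output; video and subtitles are copied as-is.
import subprocess

subprocess.run(
    ["ffmpeg", "-i", "input.mkv",
     "-map", "0",          # keep every stream from the source
     "-c:v", "copy",       # do not touch the video
     "-c:s", "copy",       # keep subtitles unchanged
     "-c:a", "ac3",        # re-encode audio to AC-3 (Dolby Digital)
     "-b:a", "640k",       # maximum AC-3 bitrate
     "output.mkv"],
    check=True,
)
```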

#6 | SSJBen | Apr 6 2018, 05:26 PM
QUOTE(craxors @ Apr 6 2018, 05:13 PM): noted for option 2. now.. need to confirm the mobo and the htib, both can send and receive dts or not.... if not then *btw.. this is nvidia thread.....

Which is why it's always best to keep it simple.

For me: GPU > HDMI > Receiver > HDMI > TV + speakers. That's basically it.

For video content, I basically only use MPC-HC black with MadVR + LAVFilters, nothing else. LAVAudio is set to bitstream EVERYTHING from the source to the receiver so that it can decode it from there. For games, I have Windows audio set to 7.1 LPCM at 48kHz - so basically all games send out raw PCM audio, which just works.

Of course my receiver also acts as a pre-amp to the external amplifiers for my speakers, but that's another topic. Point is, all my sources are connected to the receiver via HDMI, which is the key thing here. A simple connection = everything just works.

This post has been edited by SSJBen: Apr 6 2018, 05:29 PM
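
As an optional check that the OS side of that chain is actually set up for 7.1 LPCM - a small sketch (assuming the third-party `sounddevice` package is installed) that lists output endpoints exposing 8 channels:

```python
# Print every output device that exposes at least 8 channels (7.1),
# along with its default sample rate.
import sounddevice as sd

for dev in sd.query_devices():
    if dev["max_output_channels"] >= 8:
        print(f"{dev['name']}: {dev['max_output_channels']} ch "
              f"@ {dev['default_samplerate']:.0f} Hz")
```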

#7 | SSJBen | Jul 26 2018, 04:29 PM
The 1080Ti is only overkill if you don't know how to use it.

A 1080Ti at 165Hz at 1080p - now that's not overkill anymore, is it? Think. Framerate > resolution. Stop being stuck in the "60Hz standard".
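
Back-of-the-envelope pixel throughput at a few common targets (a rough illustration only; raster performance does not scale purely with pixel count):

```python
# Pixels per second at a few display targets - 1080p at 165Hz pushes
# more pixels than 1440p at 60Hz, which is the point being made above.
targets = {
    "1080p @ 60Hz":  1920 * 1080 * 60,
    "1440p @ 60Hz":  2560 * 1440 * 60,
    "1080p @ 165Hz": 1920 * 1080 * 165,
}
for name, pixels_per_second in targets.items():
    print(f"{name}: {pixels_per_second / 1e6:.0f} Mpx/s")
# 1080p @ 60Hz: 124 Mpx/s, 1440p @ 60Hz: 221 Mpx/s, 1080p @ 165Hz: 342 Mpx/s
```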

#8 | SSJBen | Aug 14 2018, 04:54 PM
Thank goodness it's called 2080 and not 1180.

#9 | SSJBen | Aug 15 2018, 03:00 AM
Sure, the GTX 2060 and 2050Ti will be a long time away. The RTX 2080 and 2070 though? Yeah, nice shit Jen Hsun.

#10 | SSJBen | Aug 20 2018, 06:10 PM
Hmmm... the fact that there will also be an RTX 2080+ is odd to me. The only difference (on paper) is that the 2080+ will have 3072 (unconfirmed) cores vs 2944 on the regular 2080. That kind of makes no sense. I think fabrication yields aren't as good as Nvidia is leading people to believe, otherwise there's no reason for the RTX 2080 not to have 3072 cores in the first place.
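
For what it's worth, the gap works out to a nice round number if you assume Turing's 64 CUDA cores per SM - a quick check (the 3072 figure is the unconfirmed rumour above):

```python
# Core-count gap between the rumoured "2080+" and the announced RTX 2080,
# assuming 64 CUDA cores per Turing SM.
cores_full = 3072      # rumoured full-die configuration
cores_2080 = 2944      # announced RTX 2080
cores_per_sm = 64      # Turing SM width

print((cores_full - cores_2080) / cores_per_sm)          # 2.0 -> two SMs disabled
print(f"{cores_2080 / cores_full:.1%} of the full die")  # 95.8%
```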

#11 | SSJBen | Aug 21 2018, 02:08 AM
RTX 2070: From $499
RTX 2080: From $699
RTX 2080Ti: From $999

Either the Nvidia site is wrong or Jen Hsun is bullshitting.

#12 | SSJBen | Aug 21 2018, 02:17 AM
QUOTE(Muhammad Syukri @ Aug 21 2018, 02:08 AM)

Where did you hear that it is more powerful than a 1080Ti? All Jen Hsun said is that the 2070 has 5x more Gigarays than a 1080Ti. Gigarays is not a measurement tool lol.

#13 | SSJBen | Aug 21 2018, 03:59 PM
QUOTE(davidletterboyz @ Aug 21 2018, 02:34 AM): This is how nVidia playing their cards. And IMHO is the beginning of the death of Radeon.
1) Instead of pushing the limit with usual methods like adding # of cores, die shrink and increasing the memory bandwidth, they play with the graphics quality this round, ray tracing.
2) But this ray tracing thing is not do-able with only brute force method. It has to come with the library and API. And nVidia had already begun the groundwork with game engines. This move, I believe, is akin to nVidia's move in GPGPU. It's like putting the first flag to claim a new land. Look at where AMD is today in GPGPU applications, especially in deep learning. The hardware is there but the community support in terms of libraries is almost non-existent. CUDA is so far ahead that no community is willing to start all over again with OpenCL.
3) nVidia uses Gigarays as the new quantifiable benchmarking standard. Like it or not, soon a lot of demos will show the difference without RTX and the Radeon cards will fall into the "ugly" version of the games. If AMD were to follow suit and play the same game, it is definitely behind by at least one gen. If it does not play the same game, they can only sell to budget gamers.
4) Radeon has been doing great in the past two years due to cryptocurrency mining and also game consoles. All game consoles run on Radeon, except the Switch. While it remains to be seen whether Sony and MS will switch to RTX in the next gen (remember nVidia's per-chip royalty scheme in the Xbox and PS3?), it definitely puts pressure on them to switch once the PC master race starts to compare graphics quality. So there you have it. Low margin, low volume (if they lose the next-gen console contract). How to fight?

Agreed.

Nvidia's "version" of ray tracing is basically GameWorks bs like PhysX, PureHair, and all the other post-FX that could already be done a long time ago. They are essentially leading the blind so they continue to be blind, i.e. Apple. RT also has to be hybrid in this case.

The problem is, I seriously doubt the next-gen consoles (hate it or love it, that's unfortunately what devs mainly have to develop for) will have any form of hybrid ray tracing equivalent from AMD's side - at least not when people are already saying the PS4 at $400 is too expensive; imagine the PS5 having to deal with the same bullshit.

The other issue is, Jen Hsun made no benchmark comparisons (even made-up ones, like they always do anyway) against the 1080Ti/Titan Xp. Showing theories and simulations is not real-world usage. But people throw money at the screen anyway upon hearing the RTX 2070 has 5x more "jeega-reys" than the Titan Xp.

Pretty sure the NDA lifts next week or the first week of September.

#14 | SSJBen | Aug 21 2018, 05:19 PM
QUOTE(terradrive @ Aug 21 2018, 04:45 PM): already have benchmark for comparing turing and pascal on ray tracing performance. Thats why they kept touting 6x faster, that's 6x in ray tracing speed lol. this rtx is the one that managed to fix the bane of game graphics engines = lighting. Any of you who fiddled with 3d modeling (3dsmax, maya, blender) sure know what I meant.

Yeah... guess what, only 3 games as of now were shown to work with RTX. THREE.

Judging by Nvidia's past gimpworks shilling - how many games actually took advantage of PhysX, Hair, VoxelAO, etc.? Yeah.

There was a list of another 18 games supporting RTX. But looking at those developers (PUBG and ARK lmao), they don't exactly instill any confidence that RTX will be as great from a performance standpoint as we're led to believe.

Sorry, but ray tracing being as mainstream as rasterization isn't going to happen with Turing, hybrid or not.

#15 | SSJBen | Aug 21 2018, 06:00 PM
QUOTE(terradrive @ Aug 21 2018, 05:30 PM): dude, did you tried how those eye candy lighting on the past games? Assassins Creed Syndicate has pretty lighting but enabling those ate 30-40% of my FPS. These are even prettier by a huge mile. Not to mention other applications will make use of it soon such as vray. Imagine almost real time ray tracing in the preview window. Normally we need to wait like 10 seconds to see it roughly rendered.

I think you're misunderstanding me.

I'm all for ray tracing and I'm well aware of what it does and can do. Hell, I'm more interested in the next advancement of ray tracing, which is its subcategory of full path tracing.

I love PhysX and all the other gimpworks bs. I call it bs not because those features are actually bs, but because they are artificially locked behind an API that Nvidia seems to think they created (which they didn't). I don't like circle jerking.

What I'm saying is, Turing isn't going to give us all of this without some big performance penalties.
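
To make the rasterization vs ray tracing distinction concrete, here's a toy sketch (a single sphere and a single light, plain Python, nothing to do with RTX hardware) of what a ray tracer fundamentally computes per pixel - which is exactly why doing it in real time needs dedicated hardware and hybrid tricks:

```python
# Toy ray tracer: shoot one ray per pixel, intersect it with a sphere,
# shade the hit point with a Lambertian (N.L) term, print as ASCII.
import math

WIDTH, HEIGHT = 64, 32
SPHERE_C, SPHERE_R = (0.0, 0.0, -3.0), 1.0
LIGHT = (2.0, 2.0, 0.0)

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def norm(v):
    l = math.sqrt(dot(v, v))
    return (v[0] / l, v[1] / l, v[2] / l)

def hit_sphere(origin, direction):
    """Return distance t to the nearest hit, or None (direction must be unit length)."""
    oc = sub(origin, SPHERE_C)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - SPHERE_R * SPHERE_R
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # Primary ray through this pixel (pinhole camera at the origin, looking down -z).
        u = (x / WIDTH - 0.5) * 2.0
        v = -(y / HEIGHT - 0.5)
        d = norm((u, v, -1.0))
        t = hit_sphere((0.0, 0.0, 0.0), d)
        if t is None:
            row += " "
            continue
        p = (d[0] * t, d[1] * t, d[2] * t)   # hit point
        n = norm(sub(p, SPHERE_C))           # surface normal
        l = norm(sub(LIGHT, p))              # direction towards the light
        shade = max(dot(n, l), 0.0)          # Lambertian term
        row += ".:-=+*#%@"[int(shade * 8.999)]
    print(row)
```

A real game has to do this for millions of pixels, with many bounces and a full scene sitting behind a BVH, which is where the dedicated RT hardware and denoising come in.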

#16 | SSJBen | Aug 21 2018, 07:51 PM
Shadow of the Tomb Raider, RTX 2080Ti with max settings and ray tracing enabled - the result? Fluctuations of 30-60fps at 1080p. Yup, 1080p.

http://www.pcgameshardware.de/Grafikkarten...Raider-1263244/

While the drivers are probably not mature yet, I doubt any "game ready" driver will provide a significant boost either. This is why I said to keep expectations in check.

#17 | SSJBen | Aug 21 2018, 08:38 PM
Jen Hsun will tell you it's still within the promised "up to 6x performance", because the 1080Ti would only run it at less than 10fps!
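
The joke checks out against the Tomb Raider numbers above, if you take the 6x marketing figure at face value:

```python
# Implied Pascal framerate if the "up to 6x" ray tracing speedup were real,
# using the 30-60fps Turing range reported for Shadow of the Tomb Raider.
turing_fps = (30, 60)
claimed_speedup = 6
print([fps / claimed_speedup for fps in turing_fps])   # [5.0, 10.0] -> "less than 10fps"
```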

#18 | SSJBen | Aug 22 2018, 02:29 PM
And this is pretty much what Nvidia wants. Pascal stock is still overloaded; they want people to buy Pascal, not Turing.

Still a win-win situation for them.

#20 | SSJBen | Aug 23 2018, 12:13 AM
Ray tracing in itself is NOT a gimmick. It is the future and it is extremely important to digital 3D content. However, that future is simply not here yet for the masses - not for the amount of money 99.9% of people are willing to shell out, that is.