
48 Pages « < 22 23 24 25 26 > » Bottom


 NVIDIA GeForce Community V19, RTX 5000 unveiled

terradrive
post Dec 13 2020, 10:41 AM

RRAAAWWRRRRR
******
Senior Member
1,943 posts

Joined: Apr 2005


QUOTE(cstkl1 @ Dec 13 2020, 09:48 AM)
Mixed bag AFAIK. Like you only get global illumination with Psycho mode.

Some people complained about the AMD greyout.
I think it's because the RT presets are tied to various settings.
*
Dynamic global illumination is the future of real-time lighting hahah

QUOTE(targon @ Dec 13 2020, 10:28 AM)
It's the true G-SYNC hardware module, not that so-called G-SYNC Compatible stuff, that makes the difference.
*
Yeah, mine has a G-SYNC module. What other current monitors have G-SYNC modules?
terradrive
post Dec 14 2020, 06:48 PM

QUOTE(Bonchi @ Dec 14 2020, 06:17 PM)
Anyway, 1920 MHz @ 0.9 V is kinda the sweet spot to get ~230 W draw and a 5-10 °C temp drop.

Can really just follow and start with nrw's numbers and work your way up or down once you're familiar with it
*
So awesome, I set mine to that to try it out on my 3070. Got overclocked performance with a more than 10 °C temp drop.

At its heart, Ampere is actually a really efficient chip.
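The undervolt numbers above line up with first-order CMOS dynamic power scaling, where power goes roughly as frequency times voltage squared. A minimal sketch; the stock operating point of ~1900 MHz at ~1.07 V is an illustrative assumption, not a measured value:

```python
def dynamic_power(p_ref, f_ref_mhz, v_ref, f_new_mhz, v_new):
    """First-order CMOS dynamic power model: P scales as f * V^2."""
    return p_ref * (f_new_mhz / f_ref_mhz) * (v_new / v_ref) ** 2

# Assumed stock point for a 320 W card: ~1900 MHz at ~1.07 V (illustrative).
undervolted = dynamic_power(320, 1900, 1.07, 1920, 0.90)
print(f"{undervolted:.0f} W")  # lands in the ~230 W ballpark quoted above
```

The voltage term dominates, which is why dropping to 0.9 V buys so much more than the small clock change costs.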

This post has been edited by terradrive: Dec 14 2020, 06:50 PM
terradrive
post Dec 14 2020, 06:52 PM

QUOTE(Bonchi @ Dec 14 2020, 06:51 PM)
This guide is for the 3080... if I'm not mistaken, the 3070 can usually handle higher clocks at 0.9 V, so you can experiment with the limits of your card for more performance, or reduce the voltage while keeping 1920 MHz for even lower temps.
*
Haha, I'm very satisfied with the current performance, maybe I'll play around with it later. Even at 1920 MHz @ 0.9 V, it's so much lower than how the card used to ramp up its power draw.
terradrive
post Dec 19 2020, 06:31 PM

QUOTE(Melvin117 @ Dec 19 2020, 02:53 PM)
Cyberpunk 2077 is extremely life-like, beautiful, and a great demonstration of ray tracing.

The best thing is it's not all forest and jungle again and again, but a stunning, futuristic city.
Something completely new.

Like man, it's so obvious when RT works and the world just looks REAL.

RT and the crazily vast world make exploring so much fun and so fascinating, even though a big part of the gameplay is pretty repetitive and Ubisoft-like.
But my god, I'll confess: if Ubi starts making their games as detailed and decent as Cyberpunk 2077, I'mma play the hell out of them.
Night City is such a beautiful canvas to load all the content onto, and CDPR sure loaded a lot onto it.

First things first, what makes the game look so great is the RT reflections; it's simply a different game WITHOUT RT REFLECTIONS.
The world is DULL without RT reflections.

Next, turning on RT lighting actually adds more shadows on top of RT shadows *I'm kinda becoming like Digital Foundry now haha*

e.g. the main quest where Johnny brings you to Hotel Pistis Sophia: if you turn on RT lighting, especially Psycho/GI, you can see light diffuse through the windows and form a beautiful, nostalgic atmosphere.
If you turn off RT lighting, the shadows from the window simply disappear, and it's a normal hotel room instead of a beautiful backdrop for Johnny's story.

Parts like these really add to the storytelling and atmosphere, and make me understand why most angry players are angry: they simply aren't playing the same game as we are.

Which is sad, but also, fuck yeah, PCMR.
I can say getting the 3080 is worth it just for this game; the other games now all feel last-gen, especially the Ubi-craps.
RDR2 is still beautiful, but on the 3080 it runs pretty much like it did on my Xbox One X, which is kinda the same case with HZD, DS and other console exclusives.

p/s: pardon me for no more screenshots, cause I mainly play the game in HDR and all my screenshots come out grayish lmao
*
The reflections are awesome because they use a roughness texture map that controls reflectivity across many different materials, such as tiles, as opposed to the overused glass and water reflections. Then again, the glass reflections in CP2077 have different IOR values across the glass too, so the reflection doesn't look flat.
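For anyone wondering how IOR feeds into a reflection: at normal incidence the base reflectivity F0 comes straight from the IOR, and the Schlick approximation ramps it up toward a mirror at grazing angles. These are the standard textbook formulas, not CDPR's actual shader:

```python
def f0_from_ior(n, n_outside=1.0):
    """Base reflectivity at normal incidence from the index of refraction."""
    return ((n - n_outside) / (n + n_outside)) ** 2

def fresnel_schlick(cos_theta, f0):
    """Schlick's approximation: reflectivity rises toward 1 at grazing angles."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

glass = f0_from_ior(1.5)                      # glass reflects ~4% head-on
print(round(glass, 3))
print(round(fresnel_schlick(0.1, glass), 3))  # near grazing: far more reflective
```

Varying the IOR across a pane changes F0 locally, which is exactly why the glass doesn't read as one flat mirror.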
terradrive
post Dec 19 2020, 07:23 PM

QUOTE(Melvin117 @ Dec 19 2020, 07:10 PM)
It's certainly a technical piece compared to Control, which only looks "reflective" in the pyramid maze, or Ghostrunner, where everything looks ultra "oily".

This feels realistic and really adds to the mood of the surroundings.
*
Because reflectivity is controlled by roughness in ray-traced renders. It's very common in 3D renders from programs such as 3ds Max, Maya, Cinema 4D, Blender, etc. These are the reflections that should be brought into real-time graphics.
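The roughness-controls-the-reflection idea fits in a few lines: roughness 0 gives a perfect mirror bounce, and higher roughness jitters rays inside a widening cone, which is what blurs the reflection. This is a toy illustration, not any particular renderer's importance sampling:

```python
import math
import random

def reflect(d, n):
    # mirror reflection: r = d - 2 (d . n) n
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

def glossy_reflect(d, n, roughness, rng=random):
    # jitter the mirror direction within a cone scaled by roughness
    r = reflect(d, n)
    j = tuple(ri + roughness * rng.uniform(-1.0, 1.0) for ri in r)
    length = math.sqrt(sum(ji * ji for ji in j))
    return tuple(ji / length for ji in j)

d = (0.7071, -0.7071, 0.0)         # incoming ray, 45 degrees onto the floor
up = (0.0, 1.0, 0.0)
print(glossy_reflect(d, up, 0.0))  # roughness 0: the exact mirror direction
```

Offline renderers shoot many such jittered rays per pixel and average them; the rougher the surface, the wider the cone and the blurrier the result.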
terradrive
post Dec 20 2020, 08:26 PM

Nobody posted this yet?

Same game assets, just a plain improvement from lighting alone. The last part in the car is a really next-generation difference.
terradrive
post Dec 20 2020, 09:21 PM

QUOTE(FatalExe @ Dec 20 2020, 08:36 PM)
If you read the comments on Reddit, YouTube, etc., it seems ray tracing is now a political feature. Anyone who talks about ray tracing gets labelled an Nvidia paid shill, because AMD told them it's not important.

There are actually people with 100+ upvotes accusing Digital Foundry of being paid by NVIDIA for making too many ray tracing videos.

AMD and the other YouTubers are laughing all the way to the bank with the gaming community.

In case someone gets offended, I'm not being very serious ^
*
Really crazy. These people just refuse to acknowledge that ray tracing is the future of real-time graphics.
terradrive
post Dec 20 2020, 11:03 PM

QUOTE(Cold|Drawn @ Dec 20 2020, 11:02 PM)
The thing about RT is, will other studios put in as much effort as CDPR to make RT a reality?
*
It'll get easier as time goes on; nothing beats easy optimization when you can just brute-force it via GPU power (future GPU performance upgrades).

We already brute-forced rasterization at 4K ultra settings; RT is the next step up to get vastly improved scenes without changing the game assets.

This post has been edited by terradrive: Dec 20 2020, 11:06 PM
terradrive
post Dec 21 2020, 01:35 PM

QUOTE(Melvin117 @ Dec 21 2020, 12:54 PM)
I won't blame them, cause to be fair Cyberpunk is the first game where ray tracing on is REALLY NOTICEABLE and different from RT off.

But now, since they can experience it for themselves and the experience is not hard to come by, maybe they should play it before speaking nonsense.

After all, RT and visual enhancements are easily perceivable, unlike audiophile snake oil, unless the person is blind
*
To those who cried that RT just drops too many fps: that's why DLSS is so important. They also need to remember that back in the day we had to crank down lighting and shadow settings, especially soft shadows, because they dropped fps too much (LOL!).

Now RT is vastly superior and yes, the fps drops.
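The fps rescue from DLSS is mostly pixel arithmetic: DLSS Quality at 4K renders internally at 1440p and reconstructs up, so the expensive shading and RT work happens on well under half the pixels. A back-of-envelope sketch; the 1440p internal resolution for Quality mode is the commonly reported figure, so treat it as an assumption:

```python
def pixel_ratio(render, output):
    """Fraction of output pixels actually shaded at the internal resolution."""
    return (render[0] * render[1]) / (output[0] * output[1])

ratio = pixel_ratio((2560, 1440), (3840, 2160))  # DLSS Quality at 4K output
print(f"{ratio:.0%} of the pixels")              # shading cost on roughly
                                                 # 4/9 of a native 4K frame
```

That is why turning on DLSS can claw back most of the frame rate that RT costs, before the reconstruction network even enters the picture.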
terradrive
post Dec 22 2020, 09:37 AM

QUOTE(imceobichi @ Dec 22 2020, 09:30 AM)
Why do so many people complain the 3080 is hot?

My temp never exceeds 60 °C gaming in an air-conditioned room.

Using a mesh case though.
*
It's not about the temperature, but the total heat output.
terradrive
post Dec 22 2020, 12:22 PM

QUOTE(targon @ Dec 22 2020, 09:56 AM)
That price is already very tempting to begin with.
Until they realize how much that thing costs in the US alone.
Americans are paying more for it than we do in MY.

There's no point in explaining it to people.
They don't use a 10900K.
They don't know it actually doesn't run really hot during normal gaming.
They don't know it's even cooler than Zen 3 at idle and light loads.
They rely on reviewers' data to compare.
And thanks to sham, OCTVB extracts the last inch of performance.
*
QUOTE(cstkl1 @ Dec 22 2020, 10:01 AM)
That's why, as I mentioned, lots of people don't know,
cause reviewers are dumbos.

They don't know what OCTVB is.
I'm at 5.3 GHz on a 125 W stock Intel chip, idling at 1 A / 0.8 W.

It's gone far beyond what you guys think OC is.

Heck, if I wanted I could go 5.5 GHz on a single or a few threads; you don't need a golden CPU.

Only ASUS mobos fully support it. ASUS also has SMA on the board.
It's insane if you know what Z490 ASUS does.
On Z590 I guess everybody will know, with Rocket Lake.
*
QUOTE(TristanX @ Dec 22 2020, 10:07 AM)
Hard data is how you compare.

I won't use a 10900K because:

Hot (cooling requirements)
It will make my place hot (aircon is only a last resort)
No PCIe 4.0
Huge power consumption
Z490 is more "expensive"
Security issues over and over again

Also, extreme OC does not apply to everyone. Just wait for Rocket Lake, which is still 14 nm. It's not a long wait.

I will give Intel a shot when they have 7 nm.
*
Like targon and cstkl1 mentioned, a 10900K OCed doesn't run at a full 250 W when gaming. I don't know how many games can hammer all 10 cores / 20 threads atm; in most games even a 6-core/12-thread chip isn't running at 80%. That's why the 5600X is popular.

Only when you do productivity work will you be hammering the power consumption full time. But then again, what percentage of users really do productivity work where 10/12/16 cores make the difference?

With the price reductions even for the 10850K and 10900K, and even the 10700K and 10700F, it's a dang good budget alternative to Zen 3.

But the main meat is still the upcoming Rocket Lake: a 4.4 GHz 11900 matches the 10900K's single-threaded performance, and a 65 W 11900 is faster than a full-blown stock 10700K in multithreading. Rocket Lake is where your "Intel is hot" is going to be stopped.

The 10 series is really the point where people should rethink the "Intel is hot" thing; that "Intel is hot" chip is the 8700K I'm using now lol, not the 10 series.

This post has been edited by terradrive: Dec 22 2020, 12:29 PM
terradrive
post Dec 22 2020, 01:24 PM

QUOTE(imceobichi @ Dec 22 2020, 01:02 PM)
Lol, what's that?

Since when did gaming become deep physics where you need to care about heat output?

Enlighten me please, sifu.
*
Like a Zotac Trinity 3080 running at more than 70 °C versus an ASUS TUF 3080 running at 60-something °C: the TUF runs at a lower temperature, but if both are running a stock 320 W, the heat output of both cards is exactly the same, no difference, and it all goes into your room.
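The temperature-vs-heat point is worth spelling out: essentially all the electrical power a card draws ends up as heat in the room, regardless of die temperature. A quick sketch:

```python
def heat_into_room_kj(watts, hours):
    """Energy dumped into the room: power (J/s) times time, in kilojoules."""
    return watts * hours * 3600 / 1000

# A 70 C Zotac and a 60 C TUF at a stock 320 W dump identical heat.
session = heat_into_room_kj(320, 2)  # a two-hour gaming session
print(f"{session:.0f} kJ")           # same either way; the better cooler just
                                     # moves it off the die faster
```

The die temperature only tells you how efficiently the cooler moves that heat off the chip, not how much of it your aircon has to deal with.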
terradrive
post Dec 22 2020, 01:28 PM

QUOTE(cstkl1 @ Dec 22 2020, 01:18 PM)
Will tell you before March 😏😏😏. If it's shit, it's shit; if it's good, it's good.

It took AMD 7 nm to fight a 14 nm CPU, and it's not even 100% stable out of their fab. Having to RMA a CPU in 2020 is a ridiculous notion.
*
On full load Intel uses a lot more power in the second power state, if I'm not mistaken, but at idle and in low-power modes AMD uses more. It's just the inefficiency of having multiple dies hanging off the Infinity Fabric, and at low power it runs hotter too. That's why you see the 10400F running so much cooler than the 3600, even though under load the 3600 is just 10 W higher in power consumption.
terradrive
post Dec 22 2020, 01:35 PM

QUOTE(Skylinestar @ Dec 22 2020, 01:31 PM)
Love for blower cards? I do love the position of the power socket.
https://www.neowin.net/news/asus-launches-r...designed-blower
ASUS launches RTX 3070 Turbo card with a redesigned blower
*
Wonder how much it'll be; it actually won't run badly if you undervolt it.

I'll take one if there's a cheap 3060 Ti blower with a single 8-pin connector haha.
terradrive
post Dec 22 2020, 01:40 PM

QUOTE(yimingwuzere @ Dec 22 2020, 01:37 PM)
These two points are right.

When Ivy Bridge came out, lots of people misread review data and complained, "this CPU is hot, stick to Sandy Bridge". I still went with a 3770K. Sure, Ivy's temps were hotter due to the smaller, denser die, but it still generated less overall heat than Sandy Bridge. Add native USB 3.0 support (SB's biggest issue) and it's a no-brainer to go for Ivy; the only reason to stick with Sandy was to try to shoot for 4.8 GHz+ stable all-core.

Same with people saying the R9 290X ran super hot. The RTX 3080 is actually a bigger heat source; just because one GPU is designed to run at 90 °C doesn't make the 80 °C one less of a heat generator. The 6800 XT at ~295 W or so (reference design) isn't too far behind.
I recognize where Comet Lake is better than Ryzen. The pros don't outweigh the cons.

I already have an RTX 3080 dumping lots of heat into a room without 24/7 air conditioning. The last thing I need is a CPU running at 140 W or more while gaming. Ryzen running hotter near idle isn't as big an issue as when stressed. Given the choice between the CPU or the GPU as the heat monster, I'll pick the graphics card every time. Not going to bother trying to balance thermals between two power-hungry chips in a compact ITX case anyway.

Maybe when Alder Lake ships I might reconsider Intel CPUs, and even then it's subject to performance versus whatever is on Socket AM5 by then.
*
Funny part about the R9 290X: AMD engineers admitted they designed the chip to run at a constant 95 °C simply because heat transfer is much more efficient at higher temperatures. They underestimated how bad the marketing/public image of it running at 95 °C would be, haha. That temp is totally fine.
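There is real physics behind the 95 °C choice: for a fixed heat output, Newton's law of cooling gives Q = h*A*dT, so a hotter die means a bigger gap to ambient air, and the same wattage can be moved with a weaker cooler. A toy sketch; the 290 W figure and 25 °C room are illustrative assumptions:

```python
def required_ha(watts, t_die, t_ambient):
    """Newton's law of cooling: Q = h*A*dT, so h*A = Q / dT (in W per kelvin)."""
    return watts / (t_die - t_ambient)

# Same ~290 W card in a 25 C room: the 95 C target tolerates a weaker cooler.
print(required_ha(290, 95, 25))  # ~4.1 W/K of cooler performance needed
print(required_ha(290, 70, 25))  # ~6.4 W/K: bigger fans/fins to hold 70 C
```

Same heat into the room either way; the 95 °C target just trades die temperature for a smaller, quieter cooler.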
terradrive
post Dec 22 2020, 04:27 PM

QUOTE(Bonchi @ Dec 22 2020, 04:22 PM)
To be fair, I've personally never faced BSODs on my Intel rigs unless they were bad-sector related or due to hardware failure. Not like these kinds of random corruption errors, hardware detection and compatibility bugs that I face on Ryzen... which is getting quite annoying, as I've changed a lot of hardware to isolate the problem: 2 CPUs, 3 PSUs, 4 GPUs, 4 SSDs, 3 RAM kits, and I'm currently on my second mobo. Fresh-installed Windows many times. The problems still happen, so I can somewhat conclude it's a Ryzen platform issue.

Intel problems, apart from the security breaches (fixed, but with some slight performance loss), I can't think of any that aren't spec related.

Performance is good, but with all these random headaches my experience is rather bittersweet. And all the money spent just to live with the issues.
*
At least you face far fewer issues running an RTX 3080 than an RX 6800 XT
terradrive
post Dec 22 2020, 04:40 PM

QUOTE(xxboxx @ Dec 22 2020, 04:29 PM)
The Ryzen 1000 and 2000 series gained supporters due to their good performance at a low price versus Intel's jacked-up prices and gimped CPUs. How many years could we only get 4 cores / 4 threads unless willing to pay a premium, lmao.

Now the roles are reversed: AMD is jacking up prices because Intel can't compete. If this continues, AMD is going to start gimping their CPUs. Intel is no saint; they reduce prices because they can't take the heat.

Being fucked over by these companies (and the AIBs in the case of GPUs) is bad enough, but being defended by stupid and delusional fanboys just makes it worse.
*
Another good rumor is that Rocket Lake seems like it's going to be cheaper? Like close to $300 and $400 for the i7 and i9.
terradrive
post Dec 22 2020, 05:10 PM

QUOTE(cstkl1 @ Dec 22 2020, 05:07 PM)
The other issue now is YouTuber influence.
You CANNOT call them out.

On OCN etc. the GN/Hardware Unboxed/Buildzoid fantards all come out.

These three are the most arrogant, self-entitled noob geeks ever. I would slap all three of them if they ever crossed my path. A day of jail time for assault would be worth it.
*
Become a YouTuber then; we'll support you for calling them out
terradrive
post Dec 22 2020, 08:29 PM

QUOTE(Bonchi @ Dec 22 2020, 07:50 PM)
I saw one of them teaching how to gain RAM performance on a Gigabyte board, thinking it was the RAM settings page I was looking for......... and it's just 10 minutes of garbage on how to set XMP, wtf. And another just OCs with most things set to auto except voltage.

When I personally found the actual page with full control, where all the primary and secondary timings are shown in hexadecimal... I didn't see any of them mention that page. And that page is heavenly; the complex and very detailed BIOS reminds me of why I liked Gigabyte mobos.

And the MLCC saga is yet another copy-and-paste from all of them, without any proper understanding of what actually caused the crashes and of the difference between the two capacitor types that produces the behavior. And the simple fix, using a PSU with low ripple and running it at the loads where efficiency is highest, could have solved the whole drama even for Zotac users, yet was never mentioned by them.

And the biggest irony: none of them mentioned the need for a good PSU and its load efficiency; instead they even tried to assure users that 650 W is sufficient.
*
QUOTE(cstkl1 @ Dec 22 2020, 08:07 PM)
Of course they're not going to mention the PSU. They've been telling buyers for years that 550 W is sufficient and 650 W is overkill for mainstream.

Sometimes it makes me wonder: since there's no public electrical spec for Zen, how many issues are actually some form of termination clash, or the loadline's handling of transients?

Just google any Intel CPU and you'll be armed with what's high/low, safe, etc., straight from Intel. Not a yellow/purple/red font in the BIOS.
*
I cringed when, in the Hardware Unboxed stream video, they "assured the viewers" that a 650 W PSU is perfectly fine... for some 90-100 W CPU and a 3080, lol...

Also, Linus is way, way, way legit as you guys said: not only Intel vs AMD and AMD vs Nvidia, but Apple vs Android (the biggest fanboyism ever).
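The 650 W skepticism is easy to sanity-check with load arithmetic, and sustained draw is only half the story: Ampere cards are known for brief transient spikes well above their rated board power. A rough sketch; the component figures are illustrative assumptions, not measurements of any specific build:

```python
def psu_load(psu_watts, draws):
    """Sustained draw total and its fraction of the PSU's rated capacity."""
    total = sum(draws.values())
    return total, total / psu_watts

# Illustrative sustained draws for a 3080 build like the ones discussed above.
draws = {"cpu": 100, "gpu_3080": 320, "board_ram_ssd_fans": 80}
total, frac = psu_load(650, draws)
print(total, f"{frac:.0%}")  # around 3/4 of a 650 W unit sustained,
                             # before any transient spikes are considered
```

Running that close to capacity leaves little margin for millisecond-scale GPU spikes, which is exactly the scenario where a marginal PSU trips its protection.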

This post has been edited by terradrive: Dec 22 2020, 08:30 PM
terradrive
post Dec 23 2020, 06:37 AM

QUOTE(Bonchi @ Dec 23 2020, 02:10 AM)
To the point that they had to turn around, eat their words, and start developing along with Microsoft's DirectML, which is literally DLSS, even though they just said DLSS is crap.

But their hardware isn't ready, unlike the dedicated tensor cores (neural network units) readily available on the Nvidias. Which could mean Nvidia is still way ahead: they can switch to DirectML if it goes mainstream while still maintaining a better performance uplift by using the tensor cores, while current Radeons need to sacrifice some compute units to run it, and are also inefficient since they're not designed specifically to calculate matrices.

Unless... AMD also starts adding neural network processors, but Nvidia is already on its third generation. A long way to catch up.
*
Not to mention neural networks need to be trained; that's a lot of time to catch up, especially since Nvidia has its own supercomputers to run the training.
