NVIDIA Killed Their Own Products (SPONSORED)

Plouffe

NVIDIA has made some massive improvements to DLSS with the 2.0 overhaul and subsequent updates, but can our staff still tell if it's turned on or not?

 

Thanks to LG for sponsoring this video! Get more information on the LG OLED TV G1 at https://bit.ly/Linus_LGOLEDTV_YT

 

 

 


Even with YouTube compression (at 4K) and a 1080p display, I could see the difference. I can only imagine how obvious it is to someone with a critical eye in person, though I'm not surprised by the results.

 

DLSS is great, though: it's better than the trash post-process AA predominantly used today, a consequence of the move from forward rendering (which could apply traditional multisample AA during the render) to deferred rendering.

 

The move to deferred rendering ruined AA and is the reason we see so much more aliasing (with PP-AA off) or blurriness (with PP-AA on) compared to older games that could run AA options like MSAA.
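A back-of-envelope sketch of why MSAA got so expensive with deferred rendering: the extra samples have to be carried by every G-buffer target, not just one color buffer. The G-buffer layout and bytes-per-pixel figures below are illustrative assumptions, not any particular engine's real numbers.

```python
# Rough memory-cost sketch: forward + MSAA vs a deferred G-buffer + MSAA.
# Byte sizes per target are assumed for illustration only.

def framebuffer_mb(width, height, bytes_per_pixel, samples=1):
    """Memory for one render target, with optional MSAA sample count."""
    return width * height * bytes_per_pixel * samples / (1024 ** 2)

W, H = 1920, 1080

# Forward rendering: 4x MSAA multiplies a single color target.
forward_4x = framebuffer_mb(W, H, 4, samples=4)

# Deferred rendering: several fat G-buffer targets (albedo, normals,
# depth, material params). 4x MSAA would multiply ALL of them.
gbuffer_bpp = [4, 8, 4, 8]  # assumed bytes/pixel per target
deferred_1x = sum(framebuffer_mb(W, H, bpp) for bpp in gbuffer_bpp)
deferred_4x = sum(framebuffer_mb(W, H, bpp, samples=4) for bpp in gbuffer_bpp)

print(f"Forward + 4x MSAA:  {forward_4x:.0f} MB")
print(f"Deferred, no MSAA:  {deferred_1x:.0f} MB")
print(f"Deferred + 4x MSAA: {deferred_4x:.0f} MB")
```

Even with these made-up sizes, the pattern is the point: MSAA's cost multiplies across the whole G-buffer, which is why engines moved to cheap post-process AA instead.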

 

The 2 big problems with DLSS are 1) moving-image resolution (as Linus pointed out), and 2) it's proprietary.

1) can be improved over time.

2) could probably be addressed, but likely won't be.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


18 minutes ago, SolarNova said:

The move to deferred rendering ruined AA and is the reason we see so much more aliasing (with PP-AA off) or blurriness (with PP-AA on) compared to older games that could run AA options like MSAA.

Guilty as charged:

[screenshot attachment: Desktop Screenshot 2021-07-03]

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

Who on this planet plays at 720p anymore? Obviously that's going to look bad, even on a 1080p monitor...

 

3 minutes ago, SolarNova said:

The 2 big problems with DLSS are 1) moving-image resolution (as Linus pointed out), and 2) it's proprietary.

1) can be improved over time.

2) could probably be addressed, but likely won't be.

I'm hoping that in a few years FidelityFX will become as good (or nearly as good) as DLSS 2.0, which would be a big kick in the balls to Nvidia.

Personally, though, I'll usually have AA, DLSS, etc. disabled unless it directly benefits my gameplay (it usually doesn't), mostly because I, like many, don't own and likely won't own a 4K monitor.

According to Steam's survey, about 70% use 1080p and only 2-3% use 4K, so this whole video revolves around at most 3% of the gaming community, based on that survey.

Of the top 10 Amazon monitors by sales, only one is 4K (currently at #3). Out of the top 50, all but 17 are 1080p; of those 17, only 6 are 4K and the rest (bar one odd one) are 1440p.

 

So Linus's closing line is basically BS... Manufacturers make what people buy; people don't (usually) buy what manufacturers make if they want something else. Most gamers will go for high-refresh 1080p or 1440p, not low-Hz/FPS 4K.

Even the local Canada Computers lists only 32 4K monitors, versus 106 and 196 1440p/1080p models (just what's displayed, not necessarily in-store stock). I understand this is basically an ad for a TV, but I'm sure they could have used this content to actually help people decide what card and monitor to buy... Wasted opportunity.


1 hour ago, Plouffe said:

NVIDIA has made some massive improvements to DLSS with the 2.0 overhaul and subsequent updates, but can our staff still tell if it's turned on or not?

 

Thanks to LG for sponsoring this video! Get more information on the LG OLED TV G1 at https://bit.ly/Linus_LGOLEDTV_YT

 

 

 

Like what? Is DLSS a new thing? Why does this video exist? The only subtle thing I found in this video is how subtly it advertises Nvidia. Because prices are falling on GPUs, especially Nvidia ones? lol

Even from compressed footage, watching on YouTube in a small window (not fullscreen), I could tell within the first seconds of the video that the left-side stream was using DLSS. Aside from all the fine detail lost in the threads of the jacket, the hair, etc., just look at the huge door in the center of each shot, lol. I find it ridiculous to even argue about it.
DLSS is a COMPROMISE, nothing to do with PC gaming! It's a technique that belongs on consoles. When you buy a PC, you buy it for the extra fidelity and performance; if you wanted smeared shit on a stick you would have bought a cheaper console.

PC gamers want cheaper GPUs because prices are ridiculously inflated; they don't want cheaper performance at the cost of quality. In other words, they want your good gear cheaper, not your shoddy gear at the same price with some extra FPS from smearing the image! @Nvidia


2 minutes ago, Asbo Zapruder said:

This is a pretty good example of the inclusion of the sponsor hurting the content.

 

Any kind of upscaling or supersampling is much, much harder to detect when sitting five or more feet away from a massive screen. It's easier to detect on a monitor, even a 4K one, because you're right up close to it. Even with YouTube compression, I doubt many people watching this on a monitor couldn't immediately tell the difference between native and DLSS. And I don't say that to diss DLSS; I LOVE it.

 

Add to that the fact that DLSS is only available on high-end PCs, and that the vast, vast majority of people who own those don't play on 4K TVs, and the whole video is kinda pointless.

Although I agree with what you said 100%, I bet DLSS is shoddy enough to be detected on a TV screen at a distance, unless you're not a gamer and don't know what to expect from quality settings.
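The viewing-distance effect being debated here can be put in rough numbers: what matters is pixels per degree of the viewer's field of view, and a couch-distance TV can actually come out ahead of a desk monitor despite its size. The screen sizes and distances below are assumptions for illustration, not measurements from the video.

```python
# Angular pixel density: why upscaling artifacts are easier to spot
# on a desk monitor than on a living-room TV at couch distance.
import math

def pixels_per_degree(h_pixels, screen_width_m, distance_m):
    """Horizontal pixels per degree for a viewer centered on the screen."""
    fov_deg = 2 * math.degrees(math.atan((screen_width_m / 2) / distance_m))
    return h_pixels / fov_deg

# Assumed setups: a 27" 4K monitor (~0.60 m wide) at desk distance,
# vs a 65" 4K TV (~1.43 m wide) at couch distance.
monitor = pixels_per_degree(3840, 0.60, 0.6)
tv = pixels_per_degree(3840, 1.43, 2.5)

print(f"Monitor: {monitor:.0f} px/deg, TV: {tv:.0f} px/deg")
```

With these assumed geometries the TV delivers noticeably more pixels per degree, which is why per-pixel artifacts that are obvious at a desk can vanish from the couch.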


57 minutes ago, papajo said:

Like what? Is DLSS a new thing? Why does this video exist? The only subtle thing I found in this video is how subtly it advertises Nvidia. Because prices are falling on GPUs, especially Nvidia ones? lol

[2]. Super weird timing.


The scene 10 seconds in looked clear to me: the right image, being native, is way sharper in every way.


A video
1) sponsored by LG, that also plugs
2) Floatplane and
3) LTT store,
4) while also looking like an advertisement for midrange Nvidia GPUs


21 hours ago, Egg-Roll said:

Personally, though, I'll usually have AA, DLSS, etc. disabled unless it directly benefits my gameplay (it usually doesn't), mostly because I, like many, don't own and likely won't own a 4K monitor.

According to Steam's survey, about 70% use 1080p and only 2-3% use 4K, so this whole video revolves around at most 3% of the gaming community, based on that survey.

I'm old enough to remember being asked why anyone would want to game at 1024×768 (or above, just to be clear), so make no mistake: 4K gaming is going mainstream; the only question is how long it will take. But that may be far enough in the future that it isn't worth worrying about 4K capability when you buy a card today.

5 hours ago, Granular said:

4) while also looking like an advertisement for midrange Nvidia GPUs

I think the video's message might actually be a bit of a mixed blessing for nVidia, given that a fair number of people will hear it and think "maybe I don't need that 3080, really". In the future we might see nVidia try to claw back some of the benefit of the DLSS-upscaling freebie one way or another, at least if it still has no serious competition to worry about.


23 hours ago, SolarNova said:

Even with YouTube compression (at 4K) and a 1080p display, I could see the difference. I can only imagine how obvious it is to someone with a critical eye in person, though I'm not surprised by the results.

 

DLSS is great, though: it's better than the trash post-process AA predominantly used today, a consequence of the move from forward rendering (which could apply traditional multisample AA during the render) to deferred rendering.

 

The move to deferred rendering ruined AA and is the reason we see so much more aliasing (with PP-AA off) or blurriness (with PP-AA on) compared to older games that could run AA options like MSAA.

 

The 2 big problems with DLSS are 1) moving-image resolution (as Linus pointed out), and 2) it's proprietary.

1) can be improved over time.

2) could probably be addressed, but likely won't be.

I really wish FSAA were still a thing. It had one core advantage over other supersampling options: it didn't break unscalable UIs. Sure, I can run a game at 5K on my 2.5K monitor and a modern game looks amazing! But there are a bunch of older games where that makes the UI downright microscopic. You didn't have that issue with FSAA settings from the late '90s through the mid-2000s.

Desktop: Ryzen 9 3950X, Asus TUF Gaming X570-Plus, 64GB DDR4, MSI RTX 3080 Gaming X Trio, Creative Sound Blaster AE-7

Gaming PC #2: Ryzen 7 5800X3D, Asus TUF Gaming B550M-Plus, 32GB DDR4, Gigabyte Windforce GTX 1080

Gaming PC #3: Intel i7 4790, Asus B85M-G, 16B DDR3, XFX Radeon R9 390X 8GB

WFH PC: Intel i7 4790, Asus B85M-F, 16GB DDR3, Gigabyte Radeon RX 6400 4GB

UnRAID #1: AMD Ryzen 9 3900X, Asus TUF Gaming B450M-Plus, 64GB DDR4, Radeon HD 5450

UnRAID #2: Intel E5-2603v2, Asus P9X79 LE, 24GB DDR3, Radeon HD 5450

MiniPC: BeeLink SER6 6600H w/ Ryzen 5 6600H, 16GB DDR5 
Windows XP Retro PC: Intel i3 3250, Asus P8B75-M LX, 8GB DDR3, Sapphire Radeon HD 6850, Creative Sound Blaster Audigy

Windows 9X Retro PC: Intel E5800, ASRock 775i65G r2.0, 1GB DDR1, AGP Sapphire Radeon X800 Pro, Creative Sound Blaster Live!

Steam Deck w/ 2TB SSD Upgrade


2 hours ago, leoc said:

But it might be far enough in the future that it's not worth worrying about 4K capability when you buy a card today.

Top-of-the-line GPUs can only just handle it right now outside of Minecraft and cherry-picked games, so I'd expect at least another 4-5 years, if AMD and Nvidia can keep up the gains they've made so far. The hardware needs to catch up to the monitors now, and monitor prices have to drop too while refresh rates rise. I too remember when 1080p was a stretch and anything over 60Hz was almost nonexistent.


1 hour ago, Egg-Roll said:

I too remember when 1080p was a stretch and anything over 60Hz was almost nonexistent.

I remember BEFORE that point, when 1600×1200 was possible and 75Hz+ was also achievable.

 

The timeline of resolution and Hz dips significantly with the development and market flooding of LCD displays, which were, until very recently, worse than the technology they replaced (CRT). People didn't move from CRT to LCD because LCD looked better; they did it for the physical size and energy usage.

 

The biggest impact on FPS figures hasn't been monitor technology; it's been software development: more complex, more power-hungry game engines and so on. 4K gaming has arguably been around since the GTX 780, one of the first 4K-capable gaming cards. Of course you wouldn't dare run a modern game at 4K on it, but back then you could, even more so with two in SLI.

One could say the current-gen 3080 is to 4K 120Hz what the 780 was to 4K 60Hz: it "can" do it, but in 9 out of 10 games it won't.

 

Monitor resolution and Hz aren't what need major development; we're already there. What we need is new, better display technology brought to the mainstream. IPS, TN, and VA LCD tech are old, and all have significant failings that, for the most part, can't be overcome until we move on to something new.

 



He finally said it.

GPUs simply can't do native anymore without upscaling, due to size, cost, and heat.
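For a rough sense of why upscaling buys so much, shading cost scales with the internal pixel count. The ~2/3-per-axis scale factor below (1440p internal for a 4K output) is an assumption modeled on commonly cited quality-mode ratios, not a figure from the video.

```python
# Back-of-envelope: shading-work savings from rendering below native
# resolution and upscaling to 4K. The 2/3-per-axis scale is an assumption.

def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)
internal = pixels(2560, 1440)   # ~2/3 of 4K per axis

savings = 1 - internal / native_4k
print(f"Internal render: {internal / native_4k:.0%} of native pixel count")
print(f"Shading work saved: {savings:.0%}")
```

Shading well under half the pixels is the whole pitch: the upscaler trades that saved work for some image-quality risk, which is exactly what the thread is arguing about.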

MSI X399 SLI Plus | AMD Threadripper 2990WX all-core 3GHz lock | Thermaltake Floe Riing 360 | EVGA 2080, Zotac 2080 | G.Skill Ripjaws 128GB 3000MHz | Corsair RM1200i | 150TB | Asus TUF Gaming mid tower | 10Gb NIC


7 hours ago, SolarNova said:

The timeline of resolution and Hz dips significantly with the development and market flooding of LCD displays, which were, until very recently, worse than the technology they replaced (CRT). People didn't move from CRT to LCD because LCD looked better; they did it for the physical size and energy usage.

Partially true: my first LCD/LED monitor was technically a downgrade from the old CRT in resolution, but not in Hz; it was still 60. I still missed the old CRTs while owning that monitor, though.

 

7 hours ago, SolarNova said:

The biggest impact on FPS figures hasn't been monitor technology; it's been software development: more complex, more power-hungry game engines and so on.

It's partially monitor tech: you can't push hundreds of frames to a 75Hz monitor, only 75, and optimally you'd want a monitor to display as many frames as possible for smoother gameplay. Our current bottlenecks are hardware (mostly GPU; 4K monitors reached the masses at 60Hz+) and shoddy game code in the final product, because "it just works, so why fix it". So we either need cleaner, more efficient code for everything (good luck with that, especially with Windows) or significantly better hardware, which takes time to develop. As you said, the 780 could technically do 4K; we're still in that "technically" stage, far better now, but still in it. We'll have to wait a few more years for GPUs to catch up; it could be 2, 4, or even 20 years. We simply don't know, and those who do know won't tell us.

 

7 hours ago, SolarNova said:

Monitor resolution and Hz aren't what need major development; we're already there. What we need is new, better display technology brought to the mainstream. IPS, TN, and VA LCD tech are old, and all have significant failings that, for the most part, can't be overcome until we move on to something new.

What would you prefer, Nano IPS, maybe OLED? The only player in that category seems to be LG (Nano IPS); it's ridiculously expensive compared to normal IPS, and it doesn't fix the more-FPS problem caused by higher resolutions, game devs, and hardware limitations elsewhere. IPS, TN, and VA are all great in their respective fields (IPS is getting better than the other two, and TN is slowly dying). In fact, the IPS relative of my monitor (which weirdly displays as IPS in the title on LG's website but is a TN panel) is currently only $50 more than what I paid for mine 1.5 years ago, and it's far superior in every way. So unless nano tech comes down in price or OLED comes to monitors, we're stuck with what we have, since TVs are meant for TV stuff, not gaming, unless you're a pleb console gamer 😜 (no offense to console gamers who only buy them for the exclusives; gotta do what you gotta do, right?). Plus, who in their right mind would daily-drive a 48" TV as a computer monitor? Especially with a price tag over $1000...

 

Simply put, you could have an OLED (or your choice of "perfect" monitor) with perfect 4K specs, but your current system won't perform as well as it could in a few years' time, because of all the other issues preventing 4K from going mainstream. I likely won't get a 4K monitor for another 6-10 years, because it won't be affordable to build a computer around one in 3-5 years, though my next build will likely include a new IPS/Nano-IPS monitor. For a 4K setup to become mainstream, it will need to deliver that output power, without gimmicks, for under $1500 all-in, excluding monitor and peripherals. Technically my 5700 XT could do 60FPS at 4K if I were willing to dumb down my in-game settings, but realistically, why should I? Why should I suffer quality loss for more pixels? If you want the best gaming FPS at 4K, you're still stuck paying over $1000 for the card alone. Know where you can build a new computer for under $200-500 plus the card that isn't shoddy or built from used parts? I don't.


13 hours ago, Egg-Roll said:

Plus, who in their right mind would daily-drive a 48" TV as a computer monitor? Especially with a price tag over $1000...

My current 1080p display is a 42" Panasonic Plasma that I bought a decade ago (it cost £1000 when I got it).

I've been daily-driving it ever since.

So, in answer to your question... I would, in my right mind, daily-drive a TV as a monitor with a price tag over $1000. 😛 And I fully intend to upgrade to a similar size once I can get both a 48" (or maybe, by the time I do, a 42") and an HDMI 2.1-capable GPU for a reasonable price (both for under £2000 total).

 

14 hours ago, Egg-Roll said:

Partially true: my first LCD/LED monitor was technically a downgrade from the old CRT in resolution, but not in Hz; it was still 60. I still missed the old CRTs while owning that monitor, though.

The last CRTs my family owned for PC use (basically mine and my dad's) were Sony Trinitron displays. They could do below 1600×1200 at above 60Hz (~85Hz natively, IIRC), or 1600×1200 at 60Hz natively.

Of course, at the time I was a kid with limited knowledge, but in retrospect those displays could have been pushed higher and faster; alas, they were eventually disposed of 😞

Anyway, the point is that the LCD monitors that came out to replace them were trash in comparison, which is why, after trying a monitor and then an LCD (CCFL) 37" TV over a couple of months, I eventually returned them and went to a 42" Plasma.

 

14 hours ago, Egg-Roll said:

What would you prefer, Nano IPS, maybe OLED?

There are only three technologies I'm aware of that are meaningfully different from current LCD and even remotely close to full market availability:

OLED, MicroLED, and dual-layer LCD.

LG's marketed "Nano IPS" isn't substantially different from anything else and suffers the same drawbacks as other currently available options.

Of the three listed above, only OLED is close to monitor-size availability. MicroLED still needs to be miniaturized down to monitor sizes and resolutions, and dual-layer LCD has so far only appeared in Hisense TVs; I haven't heard much else about it.

 

I don't particularly care what replaces current tech, so long as something does, and it can produce a quality image without all the drawbacks LCD introduced. For all its limitations, given the time period, CRT had none of current LCD's flaws in the image it produced; its flaws were all physical (size, heat, power cost, etc.).

Can you imagine what a modern-day CRT would be like if the last two decades had been spent developing and miniaturizing the tech further? It would sh*t all over current LCDs.

But I digress; I still stand by my point that resolution and refresh rate are not currently the things we should concentrate on developing further.



There are micro-OLED and micro-LED tech too.



7 hours ago, SolarNova said:

My current 1080p display is a 42" Panasonic Plasma that I bought a decade ago (it cost £1000 when I got it).

I've been daily-driving it ever since.

So, in answer to your question... I would, in my right mind, daily-drive a TV as a monitor with a price tag over $1000. 😛 And I fully intend to upgrade to a similar size once I can get both a 48" (or maybe, by the time I do, a 42") and an HDMI 2.1-capable GPU for a reasonable price (both for under £2000 total).

You're not a typical person if you're running a screen that size 😛 It's almost the equivalent of four 24" monitors in a 2×2 grid.

Truth be told, I was looking at a Plasma way back when, for use as a TV, lol.

My issue with the size is the distance I'd need to sit at to actually see everything when playing games; no thanks 😂

 

8 hours ago, SolarNova said:

Of course, at the time I was a kid with limited knowledge, but in retrospect those displays could have been pushed higher and faster; alas, they were eventually disposed of 😞

I had to help move a 32" (or thereabouts) CRT once; it wasn't exactly "fun", lol.

I do know they were significantly better than the LCD monitors of the day, but I chose LCD for the weight, plus it had 60Hz (I think; I'd need to dig it out, lol).

8 hours ago, SolarNova said:

I don't particularly care what replaces current tech, so long as something does, and it can produce a quality image without all the drawbacks LCD introduced. For all its limitations, given the time period, CRT had none of current LCD's flaws in the image it produced; its flaws were all physical (size, heat, power cost, etc.).

Can you imagine what a modern-day CRT would be like if the last two decades had been spent developing and miniaturizing the tech further? It would sh*t all over current LCDs.

That's fair. I've not really looked into the newer display techs, but I can't see OLED coming to monitors for a while longer, which is a little odd given how long it's taking; maybe manufacturing still needs work, the phone industry is consuming the capacity, or they just don't see the money in it. What I fear is that the newer, better tech will go the way of the Plasma: TV-only, never making it to monitors.

 

As for CRTs, it's hard to tell; for them to get better we would need a grid system, especially at bigger sizes, and that doesn't change the power issue. I would love to see them return, but considering electricity isn't exactly plentiful, or at least cheap, a return would end in failure.

 

I do understand IPS has its issues, but it's the best we have right now. All we can do is wait and hope the next best thing reaches us.

