
After watching the Nvidia Gamescom event, I've had something in the back of my mind that's bugging me.

Jensen Huang kept talking about an AI that could improve visuals: it would upscale images so the ray-traced output comes out at higher quality with less compute power. If that's the case... then wouldn't it mean this technology could be applied to upscaling in general, improving the quality of movies or even bringing better visuals to older games? I doubt it could make a huge difference, since low polygon counts are still low polygon counts, but could this technology be used to take a game that's built to run at nothing higher than 1080p, or heaven forbid 720p, and upscale it to 4K without any of the strange distortions or blockiness that come from a standard upscaling algorithm?
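To show what I mean by a standard upscaling algorithm, here's a rough Pillow sketch (the 720p screenshot is a hypothetical placeholder file, not from any specific game): nearest-neighbour keeps the hard pixel edges, bicubic smooths them into blur, and neither one invents any new detail.

```python
# Rough sketch of "standard" upscaling with Pillow; the input file
# is a hypothetical 720p screenshot used purely for illustration.
from PIL import Image

img = Image.open("screenshot_720p.png")           # hypothetical input
blocky = img.resize((3840, 2160), Image.NEAREST)  # hard edges: the classic blockiness
soft = img.resize((3840, 2160), Image.BICUBIC)    # smoother, but blurry instead
blocky.save("upscaled_nearest.png")
soft.save("upscaled_bicubic.png")
```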

 

Idk, maybe I'm getting excited about a technology that won't exist, but I see potential in it. What do you guys think?


My understanding was that the deep learning was trained to improve/denoise the ray-traced output. It isn't clear to me whether this is applied to just the RT content or to the composited output.

 

"better" upscaling could be interesting. Some games support different render and display resolutions already e.g. the 3D content is done at lower resolution, with native resolution UI elements, and in that case it could help.

 

Actually, they have mentioned something like this; was it DLAA? It was sold as anti-aliasing, but the same principle could apply to upscaling...

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Alienware AW3225QF (32" 240 Hz OLED)
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, iiyama ProLite XU2793QSU-B6 (27" 1440p 100 Hz)
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


11 hours ago, andpeterson said:

...could this technology be used to take a game that's built to run at nothing higher than 1080p, or heaven forbid 720p, and upscale it to 4K without any of the strange distortions or blockiness that come from a standard upscaling algorithm?

Yes, although it would just be the machine's "imagination" of what a more detailed image would look like. Don't try using it as a movie-magic "enhance" function, or you will be seeing made-up data.
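A minimal sketch of why, assuming a toy ESPCN-style network in PyTorch (untrained and purely illustrative, not Nvidia's actual model): every output pixel beyond the input resolution is inferred by the network rather than captured, so an "enhanced" frame is a plausible guess, not recovered evidence.

```python
# Toy super-resolution sketch in PyTorch; hypothetical, untrained model,
# so its output is literally "imagined" data, which is the point.
import torch
import torch.nn as nn

class ToySR(nn.Module):
    """3x-upscale net: conv features -> pixel shuffle."""
    def __init__(self, scale=3):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 64, 5, padding=2), nn.ReLU(),
            nn.Conv2d(64, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),  # rearranges channels into a 3x larger image
        )

    def forward(self, x):
        return self.body(x)

lowres = torch.rand(1, 3, 240, 426)  # stand-in for a low-res frame
highres = ToySR()(lowres)            # new pixels are inferred, not captured
print(highres.shape)                 # torch.Size([1, 3, 720, 1278])
```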


Quote

The reality of the RTX 20 Series that releases next month is this: it's a money-grab designed to get early adopters on the ray tracing hype train for the 20 or so games that will ship with the feature. It's a stopgap to 7nm cards that should arrive in 2019 and offer substantial performance gains and power efficiency improvements. And as for the price tag, Nvidia can charge whatever it desires due to lack of competition in the high-end space.

Seriously, glance at the clock speeds for the 20 Series. Check out the unimpressive CUDA core increase over the 10 Series. Realize that the memory configuration is the same as the 10 Series (albeit with GDDR6 instead of GDDR5). Take a hard look at what the performance increase should be. Most in the tech media are putting it at maybe 10% to 15% over the 10 Series when it comes to the majority of games out there. But you'll pay 40% to 50% higher prices for this generation's replacements based on MSRP. And you know we won't be paying MSRP...

Ray tracing is bleeding edge right now, but do you want to spend an exorbitant amount of money to enjoy a handful of games optimized with it?

Sorry, but I just copy-pasted this from Forbes.

 

Scary to say, AMD cards are around the corner on a smaller process node, and everyone knows 7nm parts are in the works... it really does seem like RTX is just there to pass the time. They are probably saving their "ace" for when AMD releases new cards so they don't look as dumbfounded as Intel did.

