
I’m Embarrassed I Didn’t Think of This.. – Desktop Async Reprojection

What if you didn't need the best frame rate to reduce input latency? What if your display's refresh rate was enough all on its own? With async reprojection, anything is possible… even turning 30 FPS into 240.
 

 

Comrade Stinger's original video + download links: https://www.youtube.com/watch?v=VvFyOFacljg

2kliksphilip's video about async reprojection: https://www.youtube.com/watch?v=f8piCZz0p-Y

 

 

Emily @ LINUS MEDIA GROUP                                  



So is this the kind of thing the game would have to be designed with?  Or is this something we'd be able to apply to a game on our own?  

(yeah yeah, probably a dumb question)


5 minutes ago, Newton Pens said:

So is this the kind of thing the game would have to be designed with?  Or is this something we'd be able to apply to a game on our own?  

(yeah yeah, probably a dumb question)

Actually, this is not a dumb question at all.

The answer is: it probably depends. Every game takes mouse input, and unless the movement produced by the async time warp matched what the game itself does with that input, the result would likely be distracting. And moving the camera around would be really difficult to bolt onto a game without insight into its inner workings.

So while this is totally something that could be added onto a game independently, it would be unlikely to be effective unless there were a standard mouse or camera API to hook into.
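To make that coupling concrete, here's a toy sketch of the warp itself (pure Python, every name made up, nothing from any real engine or compositor API): it "reprojects" the last rendered frame by shifting it sideways in proportion to a mouse yaw delta, which is exactly the step that has to agree with the game's own mouse handling.

```python
# Toy sketch of async reprojection for a flat screen (not any real API).
# The "frame" is a grid of pixels; when the mouse yaws the camera by
# delta_yaw_deg between real renders, we approximate the new view by
# shifting the old frame sideways and filling the exposed edge with a
# placeholder colour. Real implementations warp on the GPU instead.

def reproject(frame, delta_yaw_deg, h_fov_deg=90.0, fill=0):
    """Shift `frame` horizontally by the pixel equivalent of the yaw delta."""
    width = len(frame[0])
    # Small-angle approximation: pixels per degree of horizontal FOV.
    px_per_deg = width / h_fov_deg
    shift = round(delta_yaw_deg * px_per_deg)
    out = []
    for row in frame:
        if shift >= 0:  # camera turned right -> image slides left
            out.append(row[shift:] + [fill] * min(shift, width))
        else:           # camera turned left -> image slides right
            out.append([fill] * min(-shift, width) + row[:shift])
    return out

# 1x8 "frame": turning right by 22.5 deg is 2 px at 8 px per 90 deg FOV.
frame = [[1, 2, 3, 4, 5, 6, 7, 8]]
print(reproject(frame, 22.5))  # [[3, 4, 5, 6, 7, 8, 0, 0]]
```

If the game applies mouse sensitivity, smoothing, or FOV changes the warper doesn't know about, `shift` stops matching what the next real frame will show, which is the mismatch described above.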


16 minutes ago, Newton Pens said:

So is this the kind of thing the game would have to be designed with?  Or is this something we'd be able to apply to a game on our own?  

(yeah yeah, probably a dumb question)

This could likely be done using something like ReShade. 


I wonder if this could be implemented in existing games with Nvidia's new RTX Remix.


Here’s the next big thing: zoned resolution scaling and possibly frame rate.

 

In FPS games, I need full resolution in the middle zone, for example a circle around the target. Outside that circle I could easily scale down my resolution, preferably gradually, to 50%. If blurring is applied it could probably go even lower. I think this would be a cheap way to boost frame rate, especially on 4K, 8K, and super wide monitors. 

 

In some cases it could also be possible to reduce the frame rate outside the middle zone, such as if there are no inputs from mouse/keyboard. 
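The gradual falloff described above is essentially what GPUs already expose as variable rate shading (and, in VR, foveated rendering). A hypothetical sketch of the scale curve, with made-up parameter names, assuming normalized screen coordinates:

```python
# Hypothetical sketch of the "zoned resolution" idea: full shading rate
# inside a central circle, falling off gradually to 50% at the screen edge.

def shading_scale(nx, ny, inner_radius=0.25, edge_scale=0.5):
    """nx, ny are normalized coords in [-1, 1]; returns a resolution scale."""
    dist = (nx * nx + ny * ny) ** 0.5
    if dist <= inner_radius:
        return 1.0  # inside the aim circle: full resolution
    # Linear falloff from 1.0 at the inner circle to edge_scale at dist = 1.
    t = min((dist - inner_radius) / (1.0 - inner_radius), 1.0)
    return 1.0 + t * (edge_scale - 1.0)

print(shading_scale(0.0, 0.0))  # 1.0 (center: full resolution)
print(shading_scale(1.0, 0.0))  # 0.5 (edge: half resolution)
```

A real implementation would quantize this to the coarse shading rates the hardware supports (1x1, 2x2, 4x4) rather than use a continuous scale.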


Why not just increase the resolution then crop to avoid the stretching completely?

 

For example, on a 1920x1080 display we render the game at 2000x1125, then crop back to 1920x1080; the stretching then happens off-screen.
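Putting rough numbers on this overscan idea (a sketch; the 1/24 default margin is chosen only to reproduce the 2000x1125 example above):

```python
# How much extra rendering the overscan-and-crop trick costs.
# Rendering (1 + margin) times larger per axis means (1 + margin)^2
# times the pixels per frame.

def overscan_size(width, height, margin=1/24):
    """Render resolution needed so small reprojection shifts stay on-screen."""
    return round(width * (1 + margin)), round(height * (1 + margin))

def extra_pixel_cost(margin=1/24):
    """Fractional increase in pixels rendered per frame."""
    return (1 + margin) ** 2 - 1

print(overscan_size(1920, 1080))  # (2000, 1125)
print(f"{extra_pixel_cost():.1%} more pixels per frame")  # 8.5% more ...
```

So the example costs under 10% extra shading work per frame, though the margin only covers shifts up to that many pixels; a fast flick still runs out of pre-rendered border.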


2 hours ago, RevoltDemo said:

Why not just increase the resolution then crop to avoid the stretching completely?

 

For example, on a 1920x1080 display we render the game at 2000x1125, then crop back to 1920x1080; the stretching then happens off-screen.

Came here to ask the same question.



mark very mark, how I marked this video.

pro super ultra gamer going async into a coffee mug with a daily dose of tech news.

Retro gamer gaben, free steam deck giveaway during the award show.


5 hours ago, RevoltDemo said:

For example, on a 1920x1080 display we render the game at 2000x1125, then crop back to 1920x1080; the stretching then happens off-screen.

Performance? That said, a lot of games already have a boundary where the camera view is smaller than the rendered view, and what shows at the edges is mostly cropped away. Not sure if that's standard in game engines or something added by devs.

 

Also, for a better experience, one could do something like what Intel does with screen tearing, or what DLSS and image AI do: rather than a blurry stretch, generate an in-between image of the two frames based on the mouse position, using AI/ML to stitch them together while moving around (if that isn't too costly).

If depth is involved, reconstructing from depth-rendered images and having to stitch those depth models together could be an issue, and even more so for dynamic scenes than static ones.

 

Related: micro-meshes letting ray tracing support denser geometry, Nanite, Instant NeRF, generative models, frame generation with DLSS 3, etc.

https://www.youtube.com/watch?v=j8tMk-GE8hY

https://developer.nvidia.com/rtx/ray-tracing/micro-mesh


19 hours ago, RevoltDemo said:

Why not just increase the resolution then crop to avoid the stretching completely?

 

For example, on a 1920x1080 display we render the game at 2000x1125, then crop back to 1920x1080; the stretching then happens off-screen.

Performance.

 

While on a monitor that would be a no-brainer, async reprojection was made for VR, because even today VR demands a lot of rendering power unless we use some tricks. 4K at 60 FPS is still a pretty high bar for mid-tier hardware, but while the original Vive more or less transfers a 4K-ish image to the headset 90 times a second, the rendering demand is two separate ~2K frames at 90 FPS each: the GPU must render two different views at the same time and hold that 90 FPS as steady as The Rock in a jungle movie. Back when this was initially developed, Arizona Sunshine ran so "well" on high settings with a GTX 1080 Ti (the top card then) that it was recommended to keep a bucket in your other hand, ready for your lunch.

So the priority was to develop a system that allowed running AAA-grade graphics in VR without people getting sick, and the first solution was to separate the player from the game so the player's movements can be rendered no matter what the game's FPS is. Rendering a bigger frame would have required even more rendering power and harmed performance even further.
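Some back-of-the-envelope throughput numbers to illustrate (a sketch; 1080x1200 per eye is the original Vive's panel resolution, and 1512x1680 per eye is the commonly cited SteamVR default render target, roughly 1.4x per axis to compensate for lens distortion):

```python
# Rough pixel-throughput comparison: why VR at 90 FPS needed tricks even
# on hardware that can handle flat 4K60. Figures are approximate.

def pixels_per_second(width, height, fps, views=1):
    """Total pixels shaded per second across `views` camera views."""
    return width * height * fps * views

flat_4k60 = pixels_per_second(3840, 2160, 60)             # flat 4K at 60 FPS
vive_panel = pixels_per_second(1080, 1200, 90, views=2)   # what the panels show
vive_render = pixels_per_second(1512, 1680, 90, views=2)  # what the GPU renders

print(f"4K60 desktop : {flat_4k60 / 1e6:.0f} Mpx/s")   # 498 Mpx/s
print(f"Vive panels  : {vive_panel / 1e6:.0f} Mpx/s")  # 233 Mpx/s
print(f"Vive render  : {vive_render / 1e6:.0f} Mpx/s") # 457 Mpx/s
```

So the raw render load lands in flat-4K60 territory, with the added constraint that dropping below 90 FPS is immediately nauseating; hence reprojection as a safety net.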


This not being widely implemented is a symptom of why I stopped being impressed by modern graphics. Devs have completely stopped bothering to optimize their games. It's gone so far that devs have even stopped shipping non-broken games on release day.

 

I still marvel at early-90s games. The graphically impressive ones weren't impressive merely because of the graphics, but because of the limitations their developers had to work within. But that era of gaming is long gone.

 

Since popular games like Pokémon already get away with being released completely broken, I wonder what the future of gaming will bring. Maybe eventually they'll start selling games before they even start programming them. Not in a weird Kickstarter way, more like selling futures.

 

You know, this train of thought was meant as a ridiculous parody of what the future will bring, like when people joked about Tomb Raider 25 or Terminator 17 back in the 1990s. But I can totally see video games going in that direction and I can't even say that this would be more depressing than gaming already is.


Seems you guys recorded the video a while ago: Comrade Stinger has since updated his project, adding dynamic objects and other improvements, and that's important content you didn't show in the video.

 

I was impressed after watching your video. But after playing Comrade Stinger's demo, I think the video may be a little misleading, since you didn't discuss or show the demo's limits and artifacts, just praised the genius innovation of bringing async reprojection to the desktop. On my computer with a 60Hz monitor, the jitter and tearing are obvious and unacceptable at framerates below 30 FPS, and look much worse than what you showed in the video.

 

I think the tech is quite awesome, but apparently not yet a better, mature solution to replace motion blur or even DLSS. Tearing and AA are big problems to be solved, because in those cases the rendering rate doesn't match the framebuffer update rate. It also can't improve the appearance of dynamic objects like your teammates or enemies. But I agree it has the potential to be used with other techniques to bring a cheaper and better game experience.


On 12/1/2022 at 6:49 PM, BigNextThing said:

Here’s the next big thing: zoned resolution scaling and possibly frame rate.

 

In FPS games, I need full resolution in the middle zone, for example a circle around the target. Outside that circle I could easily scale down my resolution, preferably gradually, to 50%. If blurring is applied it could probably go even lower. I think this would be a cheap way to boost frame rate, especially on 4K, 8K, and super wide monitors. 

 

In some cases it could also be possible to reduce the frame rate outside the middle zone, such as if there are no inputs from mouse/keyboard. 

Both already exist. Zoned resolution scaling is basically variable rate shading, and zoned framerate is basically tearing.

Edit: Also foveated rendering.

