Search the Community
Showing results for tags 'dlss'.
-
Hi, I'm currently doing one of my last SATs for one of my units, and it's a report on a chosen subject. I've decided to go with DLSS, and I'm in the process of collecting unmanipulated secondary data. I'm kindly (and kind of desperately) asking anybody who has any spreadsheets comparing DLSS on and off between different GPUs, and is willing to share them towards my study, since I can't find any on the web, only manipulated data in the form of graphs. Any advice on how to get this data would be appreciated too. Thanks! Also planning to collect primary data very soon! Forgot to specify: the focus of my study is the performance increase with DLSS. The research question is: To what extent does the utilisation of Deep Learning Super Sampling (DLSS) impact the efficiency of frame generation in computer graphics?
-
Hi, I am looking to upgrade my computer as I am having trouble playing games in 4K or 1440p even with DLSS. Hogwarts Legacy, for example, I haven't been able to play for a while, because even leaning heavily on DLSS the game is unplayable. I am looking at either: Gigabyte NVIDIA GeForce RTX 4060 GAMING OC Graphics Card - 8GB @ 95.99 https://shorturl.at/jEJPW, Gigabyte GeForce RTX 3060 Gaming OC 12GB V2 @ £285.99 https://shorturl.at/gqsB6, or XFX SPEEDSTER SWFT 319 6800 XT @ £373 (imported) https://shorturl.at/yEIP7. I plan to run this card for multiple years, so I'm weighing Nvidia's DLSS to improve framerate against the raw power of the AMD. I am looking for advice on which of the Nvidia cards would be better: the one with the extra VRAM or the one with the more modern architecture. The AMD is listed but I'm not really likely to buy it; I'd just like to know whether it is worth the extra money, as I would also have to replace my PSU for another £120. The rest of my system is a Ryzen 3600, 32GB 3600 RAM, and currently an Nvidia 2060.
-
I have an RTX 4080 (full specs in sig), and in COD: MW2 I can select either DLSS or DLAA, or even turn both off for native gaming. I am gaming at HDR 400 True Black at 3440x1440 144Hz on mostly high/ultra settings, and I always want to hit at least 144fps with G-Sync on. (I have Nvidia's low-latency boost feature on.) I want it to look as good as possible whilst keeping lag low and hitting the above frames. Which one should I use? (P.S. There is also an AMD FSR 1.0 option. Can anyone tell me what this is and whether I should use it?)
-
Hello, I am struggling over which graphics card to pick out of the 3060 Ti and the 6700 XT. I currently have a 2060 Super and want an increase in performance, but mainly a cooler and quieter card. I mainly play Forza Horizon 5, Rainbow Six Siege, and modded Minecraft with shaders. I'm not a big fan of DLSS, so that isn't important to me, and RTX is hit or miss for me, so that isn't a necessity. Based on YouTube comparisons I just can't decide which would be the better option for a cooler and quieter card, which is why I need your help. From what I've seen, GDDR6X on the 3060 Ti consumes around 25W more on average than GDDR6, for at most a 3fps gain. Because of this I can't decide what is best, as I want the most efficient card, but I'm not that knowledgeable about AMD's drivers at this current time. Price does not matter here because they are so similar. I would love your 'professional' opinion to help me decide, thanks.
- 4 replies
-
- 3060 ti oc
- 6700xt
- (and 4 more)
-
Is DLSS good in Minecraft Java Edition? I literally don't know; I don't even own the card yet, which is why I want to be sure.
- 13 replies
-
Today I found out about a feature Nvidia has in their display settings that will automatically upscale games and Windows even without an RTX card. I learned about it from a channel called RandomGaminginHD, which is the biggest channel to cover it. It's super easy to find, too; it's literally the second option in the advanced settings. It was able to take Dying Light 1 on my GTX 960 2GB from 40fps to around 60fps with minimal graphical loss. Does anyone know why this isn't talked about?
-
Is there a reason the Titan V does not support DLSS even though it has tensor cores? Is it purely a driver thing? I got my hands on one from a friend who upgraded his work machine, and it's an incredibly fast card, almost matching my 2080 Ti. It has more tensor cores and could perhaps perform better if DLSS were usable.
- 2 replies
-
- dlss
- titan volta
-
(and 2 more)
Tagged with:
-
So I play Warzone a lot and heard that I can use DLSS, but there is no option to do that. I tried to find DLSS in the config.ini, but I wasn't able to, because it's kind of encrypted. Anyone want to help, or share what your config.ini looks like when DLSS is enabled, so I can figure out what it is in mine?
-
AMD has long needed FidelityFX Super Resolution to be competitive with Nvidia in ray tracing performance, and now that it’s finally here, could it possibly put up a fight? Does it have to? Thanks to 3kliksphilip for permission to use clips from his excellent DLSS low-res video:
- 30 replies
-
- amd
- fidelityfx
-
(and 4 more)
Tagged with:
-
Recently, more information has become public about RDNA 2, DirectML/deep learning benchmarks, and an AI-based FSR. Can anyone say whether this information is legit? Are RDNA 2 / RX 6000 GPUs really now capable of deep learning / DirectML features competitive with RTX cards? Does it mean we can expect that RDNA 2 / Radeon 6000 GPUs will benefit from the upcoming AI-based FSR/upscaler? Sources: https://www.google.com/amp/s/wccftech.com/amd-microsoft-bring-tensorflow-directml-to-life-4x-improvement-with-rdna-2-gpus/amp/ and https://segmentnext.com/amd-fidelityfx-super-resolution-ai/ By the way, DLSS 1.0 was not great but not too horrible either, and the 1.9 version shipped with Control ran on shader cores rather than tensor cores (2.0 moved back to tensor cores), so in theory, even with zero DirectML utilisation, AMD could still match DLSS 1.x? http://forum.notebookreview.com/threads/tensor-cores-and-dlss-non-rasterised-performance-of-nvidia-ampere-vs-turing.834939/ So with a little DirectML they could potentially add error correction, as DLSS 2.x has, and improve FSR with deep learning?
-
I got my new graphics card and realized that DLSS isn't working. I tried playing Cyberpunk and turned DLSS off and on, and it didn't make a difference. I have tried deleting the graphics driver and installing it again, but that didn't work.
-
I recently got a 3070 and have been struggling with DLSS. I have had it work once, and that was only when playing Fortnite with RTX; other games like Ghostrunner and Cyberpunk don't get affected by turning on DLSS. I have tried uninstalling my graphics drivers multiple times, and that did nothing.
-
If I buy a 1660 Ti instead of an RTX 2060, I am going to miss out on DLSS 2.0. Is that going to be a big deal?
-
Hi all, thanks for reading. I've seen a real mixed bag online about whether the 2080 Ti supports DLSS at 1440p in titles like Cyberpunk. Does it? If so, is it the same tech as in the 3000 series cards? Thanks, Nick.
-
Hi all, thanks for reading my post. I'm looking at upgrading my PC and don't know whether to go with a 6800 or a 3070. I know the 6800 is the stronger overall card, but the 3070 has ray tracing and DLSS, both nice for Cyberpunk. Are ray tracing and/or DLSS worth it yet? That is, have they progressed enough to make them worth getting simply for the few titles, like Cyberpunk, that use them? Here in Australia they're both out of stock essentially everywhere, but they both cost the same amount, $949 AUD. I guess my question is: are ray tracing and DLSS technologies that have proved themselves useful for this generation of graphics cards and games, or will the 3000 series simply become another testing ground for these technologies, much like the 2000 series? I currently run an R5 1600 and a B350 board, if that means anything in this scenario. Thanks, Nick
- 8 replies
-
- dlss
- ray tracing
- (and 4 more)
-
Hello there. So my last GPU, a GTX 1060 6GB, broke. For a while I had random black screen flickering, and now it has stopped responding at all: no display output, and after booting up the fans just stop spinning. So I swapped in my very old GTX 750 Ti while I buy a new one. Now I'm really torn between an RTX 2060 at 26K INR (325 USD) or an RTX 3050 at around the same price. Benchmarks show the 2060 still outperforms the 3050 by 30%, but the 3050 has newer RT cores. And with Nvidia having just launched the 4000 series and DLSS 3.0, I am wondering if the 2060 is still a good option for 1080p 60fps at high settings. DLSS 3.0 will only work on 4000-series cards because of their newer hardware, but there's still a chance they implement it on 3000 cards in the future. Also, the 3050 consumes less power, around 50W less. My plan was to wait a year or more until newer budget GPUs come out, but those are going to be more expensive and power hungry, so it seems I have to buy now. Let me know what you think, guys. My system: Ryzen 3 3100, PSU: Corsair CX550.
-
First time posting here, so let me know if I did something wrong. Also, take everything I say with a grain of salt; I'm only going off my own anecdotal evidence as a game developer hobbyist. When I watched LTT's video about asynchronous reprojection, it raised a good point I'd thought about before: why is this only in VR? After some thought I have a few theories. For this post I'm going to call it ATW for short (what Oculus/Steam call it).

First of all, VR usually doesn't have anything as a secondary layer on the display, so game motion isn't as affected by the frame rate. An example of what I mean: in a first-person shooter there is usually a gun as a second layer on the camera. With ATW, the gun would probably look smoother and stay with your input. The issue is that the game itself will still be running at the original framerate, so even if your gun is moving at 144fps, the enemy will be walking at 30fps; you will usually be way behind even if your gun is moving smoothly. In VR this isn't an issue, because the gun is attached to your hand and isn't nearly as precise relative to the camera; your gun in VR will run at the same frame rate as the enemies. Other motion-based effects would also be affected, such as animations, foliage, and especially fast-moving objects like vehicles.

Post processing is also probably a big issue. In VR you don't have much post processing (mostly for performance). On desktop you have a lot (film grain, color adjustments, motion blur, etc.), which I can see being affected by the different frame rates and by just how ATW works. VR didn't support Unity's URP (Universal Render Pipeline) for a while. I know a lot of PC gamers hate motion blur, but it is somewhat necessary for "cinematic" frame rates (20-30fps). A lot of people thought Avatar 2 felt off when it switched between 24-48fps, since the motion blur was shot/rendered at 48fps, so motion didn't feel as smooth.
Also, VR should still be running at the native frame rate. ATW is of course used for input smoothing, but you will still experience plenty of lag with your controllers (by lag I mean a lower frame rate compared to your visual input). Running a VR game at 36fps reprojected to 72fps isn't a good experience, which is probably why the Oculus/Meta Quest didn't ship with it for games (I think it used ATW system-wide, but games still had to run at at least 72fps). With the Quest 2, Meta later released Application Space Warp, which is much closer to DLSS 3.0 than conventional ATW. This actually gives a decent experience doubling your frame rate, so games can run at 36fps and display at 72fps (though best results come at higher frame rates, like 60fps -> 120fps). It's not perfect, and there are plenty of artifacts, especially with close objects, but the important distinction is that the entire game looks smoother, not just head input. I asked John Carmack on Twitter a while ago about it, and he did mention that it could be used in desktop gaming (which we now see with DLSS 3.0).

So to summarize, there is definitely a reason why ATW is on VR. I can't even see it going to AR/MR (since you have a real-world motion reference). Bringing it to conventional "2D" gaming might make sense for some games where animation and movement are more static, like PlayStation's Dreams, but something like CS:GO would probably be at a huge disadvantage. Even if you want higher input response, like going from 144Hz to 360Hz or something, the people who care about higher input response would probably also care that what they see isn't the most recent information anyway. So why would NVIDIA or Unity invest in porting this tech to desktop when only a small margin of users might use it, if they don't get sick from the weird motion artifacts? DLSS and FidelityFX definitely seem like the right direction for non-VR gaming: something built for the platform. Anyway, like I said, take what I said with a grain of salt.
None of this has really been tested by me; I'm just going off anecdotal evidence from working with this tech in my own side projects. Would love to know what someone smarter than me thinks. (And also let me know if I did something wrong posting on this forum.)
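The fresh-frames-vs-presents distinction above can be illustrated with a toy sketch (all names and numbers here are hypothetical, not any real ATW/ASW API): the game renders at 36 Hz, but the compositor presents at 72 Hz, re-warping the last rendered frame with the newest head pose on the in-between vsyncs.

```python
# Toy model of Asynchronous Timewarp (ATW). The "game" only produces a
# fresh frame every other vsync, but every present uses the latest head
# pose, so head motion stays smooth while in-game animation lags behind.

def simulate(display_hz=72, sim_hz=36, seconds=1):
    presented = []            # (frame_id_shown, head_pose_used)
    last_frame = None
    head_pose = 0.0
    for tick in range(display_hz * seconds):
        head_pose += 1.0 / display_hz      # head keeps moving every vsync
        if tick % (display_hz // sim_hz) == 0:
            last_frame = tick              # game rendered a fresh frame
        # ATW: even a reused frame gets warped with the current head pose
        presented.append((last_frame, head_pose))
    return presented

frames = simulate()
fresh = len({frame_id for frame_id, _ in frames})
print(fresh, len(frames))  # prints: 36 72
```

The point of the sketch: every one of the 72 presents uses a distinct head pose (smooth head tracking), but only 36 of them show new game content, which is exactly the "gun at 144fps, enemy at 30fps" mismatch described above.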
-
DLSS 2.0 has come out, yet not many people are talking about it. I've seen The Tech Chap's video about Nvidia's new DLSS 2.0, which can bring a significant fps boost over their first attempt with DLSS 1.0. After seeing that performance gain for free, I would like to ask your opinion on whether it is worth getting an RTX 2060 laptop for RT gaming. This is the link to the video I'm talking about: https://youtu.be/eS1vQ8JtbdM Thank you. Azura Lightfeilt
-
Hi, I have an issue with my Windows 10 build, and I was hoping for some feedback. I recently upgraded my graphics card from a GTX 1060 to an RTX 2060. The other day I wanted to benchmark the card using Port Royal in 3DMark and was told that my current Windows build (1803) does not allow this benchmark to be run. Yesterday, after updating Anthem, I wanted to try out DLSS with the game, and the option is greyed out. From some cursory investigation it is clear that both issues come down to the fact that I am not running Windows 10 build 1809, which has not yet been offered to my system. Further digging revealed that Windows 10 has probably "chosen" not to update to build 1809 because one of my PC components is not yet optimized for that version of Windows. I don't want to force an update that could potentially screw up my system and make me roll back or do a fresh install, so as you may understand I would rather do my due diligence first. I am running: Intel i5 8600K; 16 GB of DDR4 G.Skill RipJaws 3200 MHz; Gigabyte RTX 2060 Windforce OC; MSI Z370-A Pro mainboard; Samsung EVO 840 256GB SATA SSD for the OS, Steam, Origin etc.; 1000GB WD Green WD10EZRX IntelliPower 64MB 3.5" (8.9cm) SATA 6Gb/s for storage; 480 Watt be quiet! Straight Power E9 CM Modular 80+ Gold; Windows 10 Home edition v. 1803. Does anyone know if any of the components above have known issues with build 1809, or can you point me in the right direction? Many thanks in advance! R.
-
So Shadow of the Tomb Raider got the RTX patch and, whaddaya know, it just works. DLSS actually looks good here, better than without DLSS; NV seems to have used a sharpening filter or something. Ray tracing doesn't do much other than the shadows and AO, but the performance is decent. At this rate, ray tracing shouldn't take as long as I thought to become mainstream.
- 1 reply
-
- nvidia rtx
- geforce rtx
-
(and 2 more)
Tagged with:
-
The guys over at TechPowerUp have discovered that DLSS on Nvidia RTX cards only works at some resolutions and not others, and which resolutions depend on the card you have. In Metro Exodus, the RTX 2060 only supports DLSS at 1920x1080 and 2560x1440, but not 4K, while on the RTX 2080/Ti, DLSS only works at 2560x1440 and 4K, but not 1920x1080. Nvidia gave a response for this. Whatever Nvidia's excuse is for limiting DLSS to certain resolutions, many users probably won't be happy that a feature of their expensive card only works at certain resolutions.
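The reported per-card support can be boiled down to a simple lookup (a purely illustrative sketch of TechPowerUp's Metro Exodus findings as quoted above; `dlss_available` is a hypothetical helper, not a real API):

```python
# DLSS support per card/resolution in Metro Exodus, per the report above.
DLSS_SUPPORT = {
    "RTX 2060":    {(1920, 1080), (2560, 1440)},
    "RTX 2080":    {(2560, 1440), (3840, 2160)},
    "RTX 2080 Ti": {(2560, 1440), (3840, 2160)},
}

def dlss_available(card: str, resolution: tuple) -> bool:
    """Return True if the report lists DLSS as working for this combo."""
    return resolution in DLSS_SUPPORT.get(card, set())

print(dlss_available("RTX 2060", (3840, 2160)))     # 4K on a 2060: False
print(dlss_available("RTX 2080 Ti", (1920, 1080)))  # 1080p on a 2080 Ti: False
```

Note the odd asymmetry the report describes: neither end of the stack supports every resolution, and the overlap between the 2060 and the 2080/Ti is only 1440p.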
-
Recommended GPU for playing Anthem and other AAA titles.
David_Kwan posted a topic in Graphics Cards
Hi guys, so on the official site the recommended GPU for Anthem is an RTX 2060. But what GPU do you guys recommend to play Anthem (and other AAA titles) at 1080p max settings and 1440p high-to-max settings? I may use RTX and DLSS at 1080p if it seems playable. I plan to get a 60/75Hz monitor, or maybe 144Hz depending on my budget; let's say 75Hz for now. If I were to play on a 75Hz monitor, would a GPU that can get me more than 75fps have some conflict with a monitor limited to 75Hz? Thanks in advance! Edit: There's also the 6GB vs 8GB VRAM question. Would future games need more than 6GB?
- 53 replies
-
Now that DLSS and ray tracing are enabled, I ran the in-game benchmark on my system at max settings at 1440p. System and full specs in sig: RTX 2080 Ti, 9900K, 32 GB RAM. Results are:
DLSS OFF: RT Off - 107 FPS; Med - 92 FPS; High - 67 FPS; Ultra - 65 FPS
DLSS ON: RT Off - 123 FPS; Med - 109 FPS; High - 82 FPS; Ultra - 80 FPS
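A quick back-of-the-envelope recompute of those numbers as percentage uplifts (using only the FPS figures reported above; the setting labels are as given in the post):

```python
# Percentage FPS uplift from enabling DLSS, per the benchmark numbers above.
results = {               # setting: (DLSS off, DLSS on) average FPS
    "RT Off":    (107, 123),
    "RT Medium": (92, 109),
    "RT High":   (67, 82),
    "RT Ultra":  (65, 80),
}

gains = {setting: round((on - off) / off * 100)
         for setting, (off, on) in results.items()}

for setting, gain in gains.items():
    off, on = results[setting]
    print(f"{setting}: {off} -> {on} FPS (+{gain}%)")
```

Interesting pattern in the data: the relative uplift grows with RT load, from roughly +15% with RT off to roughly +23% at RT Ultra, which matches the usual observation that DLSS helps most when the GPU is most stressed.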
-
- ray tracing
- dlss
-
(and 1 more)
Tagged with: