
durrburger

Member
  • Posts

    63
  • Joined

  • Last visited

Everything posted by durrburger

  1. I'm using Windows 11, forgot to mention, but it was visible on Win 10 just before I upgraded. If you're referring to in-game HDR settings, I've tried many combinations and it doesn't resolve the issue sadly, since the baseline colors are fundamentally wrong for these effects, mainly fire. It also seems to be a specific OLED issue/interaction with some games, since I was playing the same game on the same PC with the older HDR TV before upgrading and all these effects looked normal. Also, the only other mentions of similar red/pink fire problems I've found online are all related to LG OLEDs.
  2. I'm really not sure if this is a TV or a GPU/PC topic, but since I got the C1 a few months ago, I've found a few examples where HDR is giving very inaccurate and wrong colors, but only in some games and only with some specific colors/sources. For example, in God of War and The Dark Pictures Anthology: Little Hope, the fire effects are red/pink and plainly inaccurate, with the effects on Kratos's blades and the house fire at the start of Little Hope. Fire in both of these cases looks orange and normal in SDR, and it looked normal in HDR on my old Sony x900f TV as well, so this is specific to the C1 and to these games/scenarios. Not all fires in all games are affected, and no other colors seem to be affected either (although in GoW there are other minor examples, like flags, that are also wrong). I really have no clue what could be causing this or how to resolve it, and it's possible that it's an OLED issue in general and not just the C1, as I've seen people mention the same issues with older models and with consoles, so it's not just PC. Here's one post mentioning the same issue with Ratchet & Clank on PS5, and a Resetera thread where those pics came from, which ALSO mentions the same GoW issues I have. That thread discusses these issues on the LG C8, which is 3 generations behind my C1, and the issues are exactly the same. I'm using the TV as a monitor with a 3080 if it makes a difference, and I've tried every combination of Nvidia Control Panel settings like RGB/YCbCr as well as the TV HDR settings, but nothing changes in these specific games. With every other game I've tested so far, no issues at all.
  3. I've been trying to get Auto HDR to work on multiple titles that I've seen people confirm support the feature, and I've never had the notification appear. The games just look like the SDR image has been tone-mapped to HDR, like toggling HDR on the desktop and launching a game in Windows 10. I don't have two systems with 10 and 11 to check side by side, but I can say for sure that no game I've tried looks anywhere near as good with Auto HDR as games with native HDR support; it just looks wrong, with bright highlights but washed-out blacks, and occasionally awful color banding and artifacts. Does not getting the notification mean that it isn't working properly, and if so, why am I not getting it? I'm using a Sony x900f hooked up over HDMI, and games with proper support like HZD or GoW look amazing in HDR. Windows 11 recognizes the display as supported in the HDR menu as well, so I'm not sure what's happening. Games I've tried are Witcher 3, Dark Souls 1/2/3, Hades, Dying Light 1/2, WoW, Mankind Divided, etc. Only in WoW and Dying Light 1 does it seem to be working relatively well, but the blacks and contrast are still a bit higher than they should be.
  4. I'm referring to the taskbar right-click on the sound menu, "speaker setup". It always defaults back to 5.1 instead of the stereo setting, and it does this whenever I launch a game or play a movie that changes aspect ratio, or anything that forces the HDMI signal to "reset" for a second. I'm only using a TV as a monitor with no external audio devices, so when it changes to 5.1 I get lower volume and can't hear voices in games when characters are behind me, for example. Is there a way to force it to stay on stereo and remove the 5.1 option completely? I don't even know why 5.1 is an option there, and trying to change it to stereo in Control Panel/Sound/Playback also resets all the time, just like the taskbar setting.
  5. The reason I'm trying to do this is that some games, like GTA V, seem to immediately jump to the highest available resolution, and I'm getting black bars during cutscenes even when the in-game res is set to 3840x2160. On my old 1080 Ti this was never a problem on this same display, but since I switched to a 3080 a few months back, every time I remove this resolution using CRU my control panel loses almost all settings menus. They completely disappear from the left menu, so I can't change global or individual game settings anymore. Did anyone else face this problem?
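Separate from the resolution-jumping issue above, one common cause of black bars in cutscenes is plain letterboxing: the cutscene is rendered at a wider aspect ratio than the 16:9 display and fit to its width. A minimal sketch of that math; the 2.39:1 cutscene ratio is an assumed example, not taken from GTA V:

```python
def letterbox_bars(disp_w, disp_h, content_w, content_h):
    """Height in pixels of each black bar when content is fit to display width."""
    scaled_h = disp_w * content_h / content_w  # content height after scaling to width
    return max(0, round((disp_h - scaled_h) / 2))

# An assumed 2.39:1 cutscene on a 3840x2160 display:
print(letterbox_bars(3840, 2160, 239, 100))  # 277 px top and bottom

# A 16:9 source on a 16:9 display needs no bars:
print(letterbox_bars(1920, 1080, 16, 9))  # 0
```

If the bars appear even with matching aspect ratios, the cause is more likely the scaling/resolution behavior described in the post.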
  6. It's never the same; it can happen in any game and then later that game runs normally. Far Cry New Dawn and Red Dead Redemption 2 often do this, RDR2 especially: it will stay near max usage but the clocks will be almost 100 MHz lower than in other games for no reason. With the same temperature and power draw it just clocks lower, and performance suffers.
  7. My 3080 sometimes started going into this weird power-saving mode when I start a game, where it aggressively tries to keep the clocks and power as low as possible all the time, causing stutter when it has to adjust them, which is constantly. This started happening a few days ago and I don't remember changing anything that might've caused it; I also updated drivers today to the latest ones with a clean install and it's still happening. To be clear, it used to downclock before to save power, as it should when a game is not demanding, but now when it happens it's much more aggressive and actively harms performance. I also tried enabling the "best performance" power mode in system settings, if that even does anything for the GPU, and I have to set games to "prefer maximum performance" in the control panel to avoid this behavior. However, not even that is a guarantee: even for games with it enabled, it will sometimes still try to downclock on launch, although I think it's a little less aggressive if the max-performance setting is enabled for that game. It also happens with an OC profile set up in Afterburner: I'll play one game for an hour with normal OC clocks and behavior (no aggressive downclocking unless the scene is really light), then boot up another game later and it will start doing it again for no reason. It's not a problem with any particular game, as it can happen in any of them, and not a matter of temps, as airflow is good in a Lancool 2 Mesh and the GPU never goes above 60°C at 100% load when it's behaving normally. I've yet to try DDU, but I wanted to see if anyone has any idea why it's doing this?
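One way to confirm the behavior above objectively is to log telemetry (e.g. from Afterburner or nvidia-smi) and scan for samples where the GPU is busy but clocked low. A sketch with arbitrary, assumed thresholds (90% utilization, 100 MHz clock deficit) and made-up sample data:

```python
def find_downclocked(samples, expected_clock_mhz, tolerance_mhz=100):
    """samples: list of (utilization_pct, core_clock_mhz) tuples.
    Returns indices where the GPU is heavily loaded but clocking
    noticeably below the expected boost clock."""
    return [
        i for i, (util, clock) in enumerate(samples)
        if util >= 90 and clock < expected_clock_mhz - tolerance_mhz
    ]

# Hypothetical log: mostly at boost, two busy-but-downclocked samples,
# one legitimately light scene at 45% utilization.
log = [(98, 1905), (97, 1800), (99, 1770), (45, 1200), (96, 1890)]
print(find_downclocked(log, expected_clock_mhz=1905))  # [1, 2]
```

If flagged samples also show normal temperature and power draw, that points at a power-management policy rather than thermal or power throttling.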
  8. I can't find any info on whether newer DLSS DLL files used for manual updates can downgrade picture quality in some games or cause visual glitches etc. I'm asking because when trying it in Control, I think I noticed more overall blurriness than with the game's native DLSS implementation, but it could be heavy placebo at work. I know in Death Stranding it fixes lots of ghosting issues, but there's not as much information for every game where this is possible. Is there a definitive answer, or can it break stuff in some games at random?
  9. Offset is more appropriate here, yeah. The 3080 is the end goal for 4K gaming, and I can afford it now, but it would definitely be breaking the bank lol. The only reason I'm considering a 3070 is that in my area at least, the 3070 went from some 1500€ a few months back to 890€ now for a non-LHR version, while the 3080 went from 2200€+ to 1250€ (LHR). So while both dropped by a similar percentage, the 3080 got much cheaper comparatively, and I'm thinking that if prices continue to drop at the same tempo until the end of the year, a 3080 will gain more value than a 3070 will by that point. But there's no guarantee that prices will keep falling or that mining profitability won't drop by the end of the year, so you may be right that it's better to grab one now...
  10. Thanks, what I'm particularly wondering is: are LHR GPU profits more volatile than non-LHR ones, since those rely primarily on Eth? As in, would a regular 3070 be a safer bet to mine on for a few months in this case? I know it's impossible to say for sure, but any estimate would help a lot.
  11. I've been waiting for a chance at a GPU for over a year now like many people, and in my area it just recently became possible. The GPU will be used for gaming and personal use, but since prices are still 70%+ over MSRP, I do plan to mine on it for as long as possible to recover some of the cost. I'm thinking of getting a Gaming X Trio 3080 LHR, available for 1250€ atm, but I'm really confused about how profitable it is right now. I've only been mining casually on my 1080 Ti using Nicehash/Dagger, so I don't know how other mining programs/algorithms work. Also, I can sell my 1080 Ti for 450€, whereas it was selling for 650€+ 2 months ago, which is why I'm considering doing all of this now and starting to mine on stronger cards while it's profitable. On the minerstat RTX 3080 LHR mining calculator, the top results are pretty solid with 6.17€ daily profit on Tent/Equihash (no clue what it is lol), and the Nicehash calculator estimates 3.86€ daily using the Kawpow algo, which is far less and with huge power use, but again more than I thought an LHR card would make. Now, how reliable and realistic are these estimates? Especially minerstat's, since even its second-best option estimates an OK 4.24€ a day at 230 W. I have used Nicehash, so I know how that works, but what about these other methods? Would it be safe to assume their estimates hold for the next 3-4 months at least? The other option I'm considering is getting a regular 3070 now for 890€, using it to mine for about 4 months, then reselling it for a 3080 when, hopefully, prices start favoring the 3080 a bit more. Right now they are both at a bit over 70% markup, but the 3080 is obviously disproportionately more expensive considering their MSRPs. Should I do this or go for a 3080 LHR right now, and how certain is it that I can get close to minerstat's profit estimates?
My goal is to get a 3080 eventually, but I'm not sure if I should go for it right now or if it's more profitable to get a 3070 first until prices stabilize further.
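The payback math behind the question above can be sketched like this. The card price, daily-profit and wattage figures are the numbers quoted in the post; the 0.15 EUR/kWh electricity price is an assumption, so treat the result as a rough illustration, not a forecast:

```python
def payback_days(card_price_eur, gross_daily_eur, watts, eur_per_kwh=0.15):
    """Days of mining needed to recover the card price, after power costs.
    Assumes the quoted daily profit stays constant, which it won't."""
    power_cost = watts / 1000 * 24 * eur_per_kwh  # EUR per day
    net_daily = gross_daily_eur - power_cost
    return card_price_eur / net_daily if net_daily > 0 else float("inf")

# 3080 LHR at 1250 EUR, using minerstat's second-best estimate
# of 4.24 EUR/day gross at 230 W:
print(round(payback_days(1250, 4.24, 230)))  # 366 days
```

Even under these optimistic constant-profit assumptions the card takes about a year to pay for itself, which is why the resale value after a few months of mining matters so much in the 3070-vs-3080 comparison.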
  12. Thanks, I'll give these a try and see what comes up. I'll try to get it done with MPC + madVR first if possible, as I already have a lot set up, before moving on to another player, but I'll try out mpv if this doesn't work. Do you know what might be causing occasional hitching during video playback with MPC + madVR? It mostly happens with 4K files and is most noticeable during panning shots; it kind of looks like a dropped frame, but the statistics don't show any. Is any madVR section in charge of playing the file smoothly, or is that all on the video decoder in the MPC settings?
  13. Thanks, but I was thinking about which of the options I have available would be best, so the TV itself with its OS, or the PC. I thought about getting the Shield, but it's over 200€ in my area and I don't need most of what it offers over the TV, especially since I'm using the TV as a monitor as well. The gigabit connection would be the biggest benefit of the Shield in my case, as the TV is limited to 100 Mbps ethernet, which in practice is roughly the same as its 150 Mbps Wi-Fi except more consistent, and neither is cutting it for very high-bitrate files. If I used the TV as a TV more often I would've probably gotten the Shield by now, but as it is, it's too much money for minimal benefit sadly.
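For the bandwidth side of this, a quick sanity check is whether a file's bitrate fits the link with some margin. The 10% headroom figure below is an assumption to cover protocol overhead and bitrate spikes, not a measured value:

```python
def fits_link(file_mbps, link_mbps, headroom=0.10):
    """True if the file's average bitrate fits the link after reserving
    an assumed fraction for overhead and bitrate bursts."""
    return file_mbps <= link_mbps * (1 - headroom)

print(fits_link(70, 100))   # True  -> a 70 Mbps file fits a 100 Mbps port
print(fits_link(95, 100))   # False -> bursts will stall and buffer
```

Since bitrate in real files is variable, peaks well above the average are what actually trigger buffering, so the headroom arguably needs to be larger for high-bitrate UHD remuxes.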
  14. I'm trying to figure out which of these to use to play movies and shows at the highest picture quality available while upscaling, but without excessive GPU usage. I have a 1080 Ti so power isn't the problem; I just don't want to keep the GPU at gaming temperatures and noise levels while watching videos. This happened while I was trying out a madVR settings.bin preset for highest settings that I found here: I got 80% GPU usage while playing 480p files, without a huge visual benefit that I could see. I'm using the Sony x900f TV as a monitor, which has a pretty good upscaler from what I know, but what I'm trying to figure out is whether it's better to play the files via Windows and MPC/madVR or just stream them to TV apps like Plex and Kodi. As for the files, I'm mostly watching either 4K HDR, 1080p, or stuff like anime at 480p. I'm mostly interested in the performance with high-bitrate 4K files, as I can't stream those to the TV apps without buffering once they go over 60-70 Mbps, and also in 480p upscaling, as I'm not sure how best to do it. I can't eyeball the difference unfortunately, as I only have the one display and can't compare side by side, but here's the impression I've gotten from trying these methods out:
Plex TV app - smoothest playback of all of them with refresh rate switching enabled; high-bitrate 4K HDR runs great when it's running, but it buffers often on high-bitrate files despite a 150 Mbps internet connection. Upscaling 1080p is great, 480p is also good, but I'm not sure if madVR can do it better? Can't show stylized subtitles.
Kodi TV app with Plex addon - stylized subtitles are shown, but video playback isn't as smooth as in the Plex app. I enabled frame rate switching but there's still some hitching during panning shots etc. It's also clunky and inconvenient opening one TV app and then waiting for it to open an addon.
MPC-HC + madVR - 4K files can hitch from time to time with any decoder settings I've tried; it seems smoothest with DXVA (copy-back), but there's still an occasional hitch that's very annoying. I also have to run specific display modes like 2160p24 to remove judder from the 4K 60 Hz desktop, and as for upscaling settings I'm not sure which to use; I'm switching between NGU Standard/Sharp/Anti-aliasing etc. and can't tell much of a difference. With this I'd like to avoid high GPU usage for silence, and there are too many parameters to customize that I have no idea about. Another downside is that I can't use my TV remote to control playback.
Is it possible to get completely smooth playback with madVR (maybe with a program other than MPC-HC?) regardless of the file bitrate, and can upscaling be any better than in the native TV apps without using extreme GPU resources and therefore producing heat and noise?
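The judder mentioned above comes from frame cadence: on a 60 Hz desktop, a 24 fps frame can't be held for a whole number of refreshes, so hold times alternate (3:2 pulldown), while a 2160p24 mode gives every frame an identical hold time. A small sketch of that, with the refresh schedule computed by flooring each frame's start tick:

```python
def cadence(fps, refresh_hz, frames=6):
    """Refreshes allotted to each of the first `frames` source frames
    when each frame is shown until the next one's scheduled refresh."""
    per_frame = refresh_hz / fps  # ideal refreshes per source frame
    ticks = [int(i * per_frame) for i in range(frames + 1)]
    return [b - a for a, b in zip(ticks, ticks[1:])]

print(cadence(24, 60))  # [2, 3, 2, 3, 2, 3] -> uneven hold times = judder
print(cadence(24, 24))  # [1, 1, 1, 1, 1, 1] -> every frame held equally
print(cadence(30, 60))  # [2, 2, 2, 2, 2, 2] -> 30 fps divides 60 Hz evenly
```

This is why refresh-rate switching (whether done by the TV app or by madVR's display mode changer) matters more for smoothness than any upscaling setting.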
  15. All looks normal under Performance. The hitches outside Chrome aren't really extreme, but they're there and noticeable; on websites it behaves kind of like when hardware acceleration is disabled, in Edge too. As for Chrome, it just hangs after a while during normal actions regardless of how many tabs are open, but it's not happening all the time, so I'll check Task Manager for the Chrome lag specifically when it happens.
  16. I'm really not sure if this is a hardware or an OS issue, but a few days ago I randomly started getting hitches or stutters in random apps, mostly during video playback and browser usage. YouTube videos stutter for half a second every few seconds on both Chrome and Edge, mostly at 4K but at 1080p too, and Chrome often gets incredibly sluggish and becomes almost unusable for no reason. Locally played video files do the same thing through MPC-HC with madVR, which never happened before, and Microsoft Word can also take 5-7 seconds to close. Video playback and scrolling is jerky and hitches regardless of the website. This is all on a usually quick system that didn't have any of these issues: 1080 Ti + 5600X, 16GB DDR4 3200 MHz, everything on SSDs. There was a Windows update recently, but I'm not sure whether it coincided with these sudden problems or they started before or after it for some other reason. Nvidia drivers are on 466.75 right now. Does anyone know what could be causing all of this?
  17. I just got the Lancool 2 Mesh RGB and am really loving it so far, but I was wondering about a few things when it comes to maximizing airflow; it would be ideal to get some before-and-after experiences if anyone has them. The first thing is whether I should replace the 3 front 120mm ARGB fans with 2 140mm ones, probably Arctic P14 PWM, and whether that would have a noticeable impact. The bottom 120mm fan looks partially obstructed by the PSU shroud area, so I'm guessing the 2 140mm ones would do better overall, but I wanted to check if someone has tried it before buying. Secondly, is the front top exhaust fan counterproductive in this case? I have 2 P14s on top as exhaust and one P12 in the back atm; the front top exhaust fan is blowing out cold air when I put my hand over it, while the back top one blows out warm air. Does this mean the front fan is robbing components of some cold air, or would that air exit the case through the opening anyway if there was no fan there? The final question is whether it's worth putting 2 120mm fans on the PSU shroud and how much of an improvement that can make. My priority is GPU temperature, since that's what's under load most often, and while it's already better than in any case I've had before, I'm trying to see if it's possible to improve it any further.
  18. Not really sure if I should post this here or in the GPU section, but here's the problem. I'm using the Sony x900f TV as a monitor with a 1080 Ti, and today I hooked up another monitor as a secondary screen over DP, but now that I've reverted back to the TV and unplugged the second monitor, I've started getting screen tearing on the desktop, and all colors look overexposed, like a fake ReShade preset or something. I'm not sure if that caused the problem, but nothing else happened in between. What might be relevant is that one display was 4K 60 Hz and the other 1440p 75 Hz. I can't take a picture of the issue as I have no reference of how it was before; it's subtle but still visible and annoying. Everything gets brighter than it should be and bright colors are clipping; it's visible even with desktop folder icons. There is a visible shift in colors/contrast, or whatever is changing, that happens at random. Another problem, which also might be the same thing, is that when running games in HDR, when that "shift" happens, HDR either gets a purple sheen to it or the contrast is all wrong and blacks are crushed with a lot of color banding. Whites get a purple hue and it just looks bad. I've also started getting screen tearing while scrolling through websites with hardware acceleration off, which I've never had before. I've had hardware acceleration disabled when I'm occasionally running Nicehash, as hardware acceleration in Chrome/Edge uses enough resources to affect the speeds, but I've never had tearing prior to today. I've since reinstalled drivers with DDU, trying 2 different driver versions, but I'm still getting the same thing. I also tried a different HDMI cable and plugging back only the monitor via DP, and all the problems persist. Any ideas on what's happening? Did my GPU start dying all of a sudden?
  19. That's MicroLED; Samsung just announced a new TV with it for the bargain price of $156,000, so it probably won't be coming to monitor panels at anything close to affordable for god knows how long. Mini LED is far less impressive but should still be an improvement in terms of dimming zones, as it's the same LED backlight tech just with much smaller modules, so more control over dimming. It will still be more expensive than the old one for sure sadly; idk why they can't just release the old model but flat and with all the issues sorted out.
  20. I was looking at that one yesterday but saw that a lot of people on reddit complained about smearing everywhere. What type of games/content did you see the smearing on that model? I'm guessing that most people who are comparing VA smearing are usually playing fast paced esports games so I never know how much value to put into smearing comments as I never play those. Then again my old Samsung TV that I briefly used as a monitor had atrocious ghosting even while scrolling through black text on white websites...
  21. Thanks for the recommendations! The MAG274QRF looks like the best all-around affordable IPS so far if VAs are out of the question, which it looks like they are. Which do you think is better between it and the Gigabyte M27Q? Going by Rtings they are pretty close, with the MSI having better uniformity but much less accurate colors, though I'm not sure how much uniformity/clouding can vary between units. Both are 440-460€ in my area. As for the LG 27GL850-B, it looks worse than both going by this comparison, unless I'm missing something. I just don't understand their local dimming segment, which I know is there for reference, but between these 2 the LG has worse contrast and black uniformity, yet comparing their local dimming reference videos it has much better blacks. Are those videos even from the specific monitors they are reviewing?
  22. I would gladly but nearly all the monitors available in the local stores are the 100-200 euros 1080p60 lowest end panels, since that's what most people buy. Same with TVs, not a single model is on display that's above entry level.
  23. That's the sadness I wanted to cure with a 1440p monitor at least so that my current GPU can hit 60 in modern titles but...here we are
  24. So I've been using the Sony x900f TV as a monitor for a while now and was hoping to upgrade my 1080 Ti to a 3080 as soon as it was announced to keep gaming at 4K, but since that won't be happening anytime soon, I started looking at 1440p G-Sync (compatible) monitors instead. As someone who primarily plays single-player games, 144 Hz is the most I'll ever need, and I'm not sure what response time I should aim for, as I don't know how much more pronounced ghosting is at 144 fps compared to 60. For reference, this TV has a 10 ms 100%, 3 ms 80% response time and I never notice ghosting at 60 fps, whereas it was a constant eyesore on the old Samsung NU7100 at 9.5/24 ms. From what I understand, ghosting is more visible at 144 Hz, so it needs a faster response time to remain ghost-free? TN monitors are a hard no, and with IPS, as much as I hate their contrast, what's killing me more is that basically every panel I've seen has some degree of bleed/IPS glow that's massively distracting to me. The problem with this is that I'll have to buy from a reseller, as stores in my area have atrocious prices as well as a terrible selection, so I can't keep buying and returning unless there are some serious issues (backlight bleed probably won't qualify). As I vastly prefer VA when ghosting isn't noticeable, the Odyssey G7 seemed like a perfect monitor, but the curve is hugely unappealing, especially at 27 inches, and they are currently 570/640€ for the 27/32-inch models respectively in my area, which is more than the maximum 450€ I'd be willing to spend on a monitor right now. I've been looking through Rtings reviews and I can't find a single VA alternative that doesn't have some other glaring flaw plus smearing, or an IPS monitor that doesn't have terrible black uniformity with clouding etc. Are there any decent cheaper alternatives to the G7, or IPS monitors with a bit better QC in terms of bleed?
Like I said, I don't play competitive games, so visuals are the priority: contrast, relative color accuracy (as in not having extremely oversaturated or noticeably inaccurate colors), and good G-Sync (FreeSync) to avoid input lag and tearing. I've gotten really used to huge screen sizes, so I'm trying to find a 32-inch one with decent text clarity, though 27 is fine as well.
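On the response-time question above, one rough rule of thumb (an assumption for intuition, not a formal spec) is to compare the panel's response time to one refresh interval: if the pixel transition takes longer than a frame is on screen, the previous frame visibly smears into the next, which is why the same response time that looks fine at 60 Hz can ghost at 144 Hz:

```python
def frame_time_ms(refresh_hz):
    """Time each frame stays on screen, in milliseconds."""
    return 1000 / refresh_hz

# Compare an illustrative 10 ms pixel response against each refresh interval:
for hz in (60, 144):
    ft = frame_time_ms(hz)
    print(f"{hz} Hz: {ft:.2f} ms per frame; a 10 ms response spans "
          f"{10 / ft:.1f} frame(s)")
```

At 60 Hz a 10 ms transition finishes well within the 16.67 ms frame, while at 144 Hz it outlasts the 6.94 ms frame entirely, so a 144 Hz panel wants transitions comfortably under ~7 ms.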