atomicus
Member · 944 posts

Everything posted by atomicus

  1. It's a bit misleading with 3840x1600 because those panels are 38", so the PPI is actually the same as a 34" 3440x1440. If you compare a 38" 3840x1600 to a 32" 4K screen there is a very noticeable difference in sharpness. Not to take anything away from 38" mind you, it's very nice. The fastest refresh you'll get on one of those is 75Hz on the Acer XR38, which also has FreeSync... and basic HDR I think on the latest version, though only the bare minimum needed to tick the box.

     The Acer Predator XB321HK is STILL the only 32" 4K monitor with G-Sync (there's a 27" version of it too), but it's nearly 3 years old now. Nice panel though, true 10-bit IPS, and despite being only 60Hz, G-Sync keeps it nice and smooth for games. There are a few 4K FreeSync monitors around (again all 60Hz), but good luck running those maxed out with AMD potatoes. You really need a 2080Ti to assure a solid 60FPS at 4K if you want to run ultra settings, and even then G-Sync is a nice thing to have... so that results in an annoyingly short list. The stupidly priced Acer/Asus 27" 4K HDR 144Hz monitors are just a joke really... total halo products for the rich and foolish. Not that I'd discourage people from buying them... we depend on your "must have new shiny shiny" attitudes so that manufacturers realise there is a market and can make them cheaper for the rest of us.

     The 21:9 HDR QD 200Hz screens look interesting, but VA is always bad for ghosting, and footage of these monitors in action suggests this might be an issue unless they can fix it. These problems are usually inherent to VA panel tech, and while it always varies game to game, I'm not feeling too confident about these. We shall see, but given how extortionately priced they're going to be, I'd want near perfection! HDR is still a joke on PCs anyway, so that in and of itself is mostly a marketing gimmick and nothing to get excited about just yet. It's here to stay though, that's for sure, and there will come a time when HDR is worth it, but that time isn't now. If you want to experience HDR done right, get a console and a 1000-nit HDR TV.

     Personally, I'd be happy with a BLB/glow-free, uniform, colour-accurate 32" IPS 144Hz 4K G-Sync monitor. IPS has no ghosting or blur like VA; it's only IPS glow that lets many panels down.
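     For anyone curious, a quick sketch of the PPI arithmetic behind this (Python; panel sizes and resolutions as quoted above):

         import math

         def ppi(width_px, height_px, diagonal_in):
             # Pixels per inch: diagonal pixel count divided by diagonal size.
             return math.hypot(width_px, height_px) / diagonal_in

         print(f'38" 3840x1600: {ppi(3840, 1600, 38):.1f} PPI')  # ~109.5
         print(f'34" 3440x1440: {ppi(3440, 1440, 34):.1f} PPI')  # ~109.7, effectively identical
         print(f'32" 4K UHD:    {ppi(3840, 2160, 32):.1f} PPI')  # ~137.7, hence the sharper image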
  2. Plenty of people have 2080s; 2080Tis less so, given the shortage. Perhaps most 2080 owners aren't at 4K, who knows, but there are still a fair few. Benchmarks answer a different question... how the cards compare at the same settings, not equalised for FPS. Of course it's going to vary person to person, and also game to game... that goes without saying... but an opinion from someone who's experienced it is better than none at all, and at least goes a small % of the way to offering some insight.
  3. @iLostMyXbox21 Yes, I am well aware of the benchmarks, and that the 2080Ti beats the 2080 (obviously). That's not what I wanted to know. I wanted to know ACTUAL REAL WORLD experiences with the 2080 at 4K and how people are finding it. Are you having to turn down settings in certain games to achieve smooth enough FPS (if so, which games), and is that acceptable given the graphical compromise that may need to be made... or at 4K is that compromise not really evident?
  4. It seems impossible to start a conversation about the 2080 without inciting hate and accusations of it being a pointless card and "buy a 1080Ti instead", but I will try regardless... All I really want to know is if anyone has one and is gaming at 4K, and what your experiences are. Are you having to turn down details to get comfortable FPS, and if so are you happy with the graphical compromise (if any) that you're having to make? I know the 2080Ti is the true 4K gaming card, but I'm ONLY curious about the 2080 and users' actual experiences with it.
  5. These must be delayed significantly, just like every other HDR high-refresh monitor before them. They'd be announcing them by now otherwise, so the silence must be down to production issues or delays of some sort. We won't be seeing these in 2018 now I'm sure... Q1 2019 if we're lucky, but more likely Q2/Q3 I think. Monitor tech moves at a snail's pace.
  6. Unfortunately that is the cost of G-Sync. But ultimately the monitor is your connection to the experience and what you've paid all that money to enjoy... it's the most important aspect of any build and is what will determine your satisfaction with it. You don't need 1440p G-Sync though; you can just get 1080p G-Sync... if you're only gaming, that is more than fine. Productivity is definitely better at 1440p, but it's more demanding in games, so if you want to max out your FPS, 1080p might suit you better.
  7. As I say, it will entirely depend on how good it is and on the consumer/market reaction. There will be a significant number of gamers with RTX cards... when Pascal stock dries up and goes EOL, which it will in the coming months, the 2080 and 2070 will become more dominant, especially if prices are dropped. There are already 25 games due for DLSS support, and if it works well and boosts sales of both those games and RTX GPUs, then you can bet other devs are going to jump on the bandwagon. This is NOT GameWorks, HairWorks or anything that simply offers cosmetic enhancements at the expense of performance. DLSS is specifically designed to boost frame rates and make games look better and run faster (although, as previously stated, at native 4K it won't necessarily look better, it will just run faster). It's unprecedented tech and has huge potential. And SLI is possibly making a comeback with NVLink... early benchmarks with the RTX cards already show impressive scaling, better than traditional SLI anyway. Time will tell on that one also.
  8. Yes, but also in titles where the 2080 could struggle slightly at 4K (and it will on occasion), DLSS could help get the frame rate up. And as you say, for ray tracing I think it's going to be vital, but we'll be waiting a while to find out on that front.
  9. No, I don't think it will... I am not particularly excited about ray tracing, but who knows. This tech is so new, and devs have had no time to sink their teeth into it. But I won't hold my breath. I think the best we'll get this generation is some semi-integration of ray-traced elements in scenes, with DLSS helping massively. DLSS alone is worth getting more excited about. I don't really see the logic of buying a new 1080Ti now though. There's zero future-proofing, and while it's a great card, it's not getting any better. If I already had one, I'd keep it, but if I were getting a new card the 2080 would be the obvious choice if I couldn't afford a 2080Ti. It's got room to grow as drivers mature, never mind the DLSS potential; there's NO question that the gap between the 2080 and 1080Ti is only going in one direction.
  10. I can't speak to your experience with it, but to my eyes, in the low-FPS dips it's definitely noticeable when it's not there. My last few monitors have been G-Sync, but when I've played on monitors that aren't, I notice it. It's not an ever-present thing, but it's there. As to whether it's worth it, that really depends on the user, what games they play, and how sensitive they are to these things. All I can say is that I notice and miss it when it's not there. Besides, I have a G-Sync monitor at present anyway. The G-Sync tax sucks though, and I'd be quite happy if Freesync took over, or they merged somehow.

      The success of DLSS is certainly dependent on developer support, but the initial take-up already looks strong and many devs have been talking the technology up, recognising its potential. The biggest challenge moving forwards, as with any bleeding-edge tech, is that it initially benefits only a small sector of the market. The potential benefits for gaming are clearly evident though. Yes, it does require pre-calculation, but that doesn't mean it's only going to end up in a few titles... especially if it REALLY works! It's a win-win for everyone, so the incentive is there across the board.
  11. What's generally not worth it? DLSS? We don't know yet, and more than a handful of games supporting it have already been announced. How good it ends up being will determine how many more there might be... we obviously can't say how effective it will be at this stage, but it does hold a lot of potential.
  12. Today, I don't think that's necessarily a bad decision, depending on what games you play, but looking ahead DLSS could see the 2080 take a significant lead in the games that support it. I know this is uncertain yet, and there is an element of buying on hope here, but at the end of the day the 1080Ti isn't going to get any better... the 2080 is, on driver maturation alone. NVLink scaling also appears to be far better than traditional SLI, should that be something people are considering down the line. DLSS holds great potential, and it's more reason to be excited about these cards than ray tracing. In a few months' time, there's a chance we see a lot of buyer's remorse from people who, angered by the 2080 pricing, have gone out and bought a full-price 1080Ti in recent days. I would argue that if you already have a 1080Ti, keep it... but if you're in the market for a NEW card that you intend to keep for a long time, the 2080 might just make more sense, depending on your use case, resolution and the games you play today and intend to play in the future. Regardless, this is another discussion altogether and not what I was seeking information on. I merely wanted to know if anyone who has already bought a 2080 is using it with a 4K G-Sync monitor.
  13. The reason I ask is that I already have a 4K G-Sync monitor... the 32" XB321HK, which I've had for a while. It's a great monitor, I love it, and it cost me less than the one you've linked.
  14. Is anyone using an RTX 2080 with a 4K G-Sync monitor? This is one combo which doesn't seem to have been discussed at all in reviews, and it would certainly (in theory) make the 4K experience with the 2080 that bit smoother and more playable in those games where it's perhaps not hitting the FPS numbers we would like. Curious if anyone actually has this combo and can comment?
  15. No need to apologise, I've found all the info discussed very informative.
  16. In his defence, I did state in my initial post I was looking at a small test-bench set-up, but I've since moved away from that idea and am focusing more on an SFF build... although the Ophion case is a bit bigger than some might consider true SFF. I don't think they are releasing a blower 2080/2080Ti.
  17. Just seems amazing to me that I could get away with 450W... I never thought that would be possible, so it's tricky wrapping my head around it lol!
  18. Is there a point during gaming or other tasks (not benchmarks) where both the CPU and GPU would be drawing their max simultaneously... i.e. 152W for the 2700X and 260W for the GPU (2080Ti)? I've not studied that 2700X review, but is that 152W factoring in the boost? I am not OC'ing my CPU, as there is largely no point doing so with the 2700X unless for hobby/benchmark reasons, which I have no interest in. Single-core Precision Boost to 4.3GHz does a better job in gaming than a manual OC would achieve.
  19. If the 450W Nightjar is enough for my set-up, then that is definitely appealing. My only concern is whether it is indeed enough. I have 2x SSDs and 1x M.2, plus as mentioned a 2700X, 32GB RAM and potentially a 2080Ti further down the line. If I do have an AIO, that adds a touch more. Regarding the case, I do want a TG side panel, so that limits selection far more. I don't need it to be 'tiny', I just need it to be a bit more manageable and lighter than a full-size ATX, as I often move it from room to room in my house. Never had an issue with AIOs personally, and I've done a lot of custom loops and never had any leaks there either. I'd even consider custom cooling the CPU in an ITX build in the right case (I think adding the GPU to a loop would be a bit tricky), but again I'd still want a TG side panel. The Ophion ticks a lot of boxes and at an attractive price. There are a couple of rather nice ITX cases in development (which also feature TG and are smaller), but they look to be three times the price of the Ophion. Not that I'm in any urgent rush here.
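      As a rough sanity check on whether 450W covers that build, a back-of-envelope sketch in Python (the non-CPU/GPU figures are ballpark estimates, not measurements):

          cpu_max = 152   # 2700X package power with boost, per the review discussed above
          gpu_max = 260   # RTX 2080Ti stock power limit
          rest    = 30    # rough allowance: board, 32GB RAM, 2x SSD, 1x M.2, AIO pump/fans
          psu     = 450

          total = cpu_max + gpu_max + rest
          print(f"{total}W peak of {psu}W -> {total / psu:.0%} load")
          # -> 442W peak of 450W -> 98% load

      So an absolute worst case just about fits, with almost no headroom; in practice games rarely push CPU and GPU to their simultaneous maximums, which is why the 450W answer is plausible but tight.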
  20. Love the look of it, but I think that 450W Silverstone would be a bit underpowered for what I'm looking to run? Yes, always a compromise. There are some cases I'm looking at with rad support for the CPU though (the Raijintek Ophion Evo for example), as that CPU cooler clearance height is a big issue for sure. I do need some portability with my system, so I am forced to go down the smaller route. I don't need it to be Raven-sized though. The smaller the PSU the better, but it doesn't technically HAVE to be SFX. The Corsair ones, I've realised, are too big at 200mm. Even the Seasonic might be tight in some cases at 170mm, but that one is going to be silent for sure. The Seasonic Prime Ultra (Platinum and the new Snow Silent) is 140 mm x 150 mm x 86 mm, which seems quite compact. From the Cybenetics tests it seems to perform rather well, and Jonny Guru rated it very highly. Unfortunately those Corsairs are just TOO big for the cases I'm looking at; even the ITX cases that take ATX PSUs won't take those. I'm not actually that bothered about cost, so the fanless Seasonic isn't off the table.
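      For scale, a quick comparison of the form-factor volumes in play here (dimensions in mm; SFX per the spec, the others as quoted in this thread):

          psus = {
              "SFX (e.g. SF600)":     (125, 63.5, 100),
              "Seasonic Prime Ultra": (150, 86, 140),   # quoted above as 140 x 150 x 86
              "Corsair ATX units":    (150, 86, 200),   # the 200mm-deep ones that won't fit
          }
          for name, (w, h, d) in psus.items():
              print(f"{name}: {w * h * d / 1000:.0f} cm^3, depth {d}mm")
          # SFX comes out at well under half the volume of the 200mm ATX units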
  21. Interesting info there, thanks. Not a major improvement over the original SF600, it would appear. No, not completely passive... I am just seeking to reduce noise wherever and as much as I can.
  22. Interesting on the Corsairs... I'd never even considered them due to a bad experience I had with one years ago. Will take a closer look. The more I think about it, the more I like the idea of SFX, just for the space and weight saving (I have an ITX board and no HDDs), but I'm aware there aren't really any silent options... yet. The SF600 Platinum is supposedly up to about 30% quieter, but it's not out yet. I'll keep an eye on that as well. To be honest, if I were looking to spend HX1000i money, I'd probably be more inclined to look at the Seasonic Prime Fanless 600W, which is about the same price and gets excellent reviews wherever you look.
  23. I am not overclocking my CPU... there is no point with the 2700X, best to just let boost do its job. As for the 20xx series, hard to say what benefit will be had from OC'ing, but allowing for the 280W max as you've done should see me OK... I hope!
  24. Yes, I know Corsair do them for the SF600, and they're bundling them with the SF600 Platinum. I may wait for that, as I would like the lightest, smallest package possible... and if 600W is enough to run my system then I don't need to go full ATX. The Platinum version seems like it will be worth waiting for over the current Gold-rated one, given its new PWM fan config.