About GuruMeditationError

  1. Thanks for responding. Since posting I found this on youtube: GeForce RTX 2080 Ti Test - 1440p Performance - 30 Games Benchmark Review (Intel i7-9700K) 144Hz https://www.youtube.com/watch?v=XEAJ-1xuASk&t=3573s ...which is pretty helpful, but it's running with a 9700K. Sucks to have to stick with legacy hardware, but everything's relative I guess. I might actually end up having to buy a new 3D capable monitor (while they still make them) and leaving it in storage until I have hardware that can drive it.
  2. Hi. I've just edited the initial post, so bumping... ...what do people think?
  3. I have a couple of spare/backup legacy 1080p gaming monitors that I use for Nvidia 3D Vision, and which require DVI for 3D. And before anyone says it: yes, I know, 3D Vision is dead, 3D sucks, I'm an idiot for liking it, 1080p's dead, I should buy a new monitor...fine...but with that out of the way: My main gaming monitor for 3D is a (legacy) BenQ with a max refresh rate of 120Hz, which uses DisplayPort for 3D. For 3D Vision's lifespan, 1080p was the sweet spot, as anything over 1080p required (at the time) very powerful hardware to get the necessary frame rates. My build is as follows: i7 4790K, Asus Z97-WS (motherboard), 16GB Klevv Genuine 2666MHz DDR3, Asus ROG STRIX RTX 2070, Kingston 120GB V300 SATA 3 SSD. The CPU & RAM are overclocked using Asus' onboard overclocking utility in the BIOS. With the above setup, FRAPS reads 48 to 60 fps in The Witcher 3 in 3D on ultra settings on my BenQ, via DisplayPort. As above, the spare/backup legacy gaming monitors I have only support 120Hz over DVI (DVI-D, to be precise). Currently, the most powerful Nvidia card available with a DVI port is the RTX 2070, and at 1080p it only pulls ahead of the GTX 1080 Ti when overclocked. See this for comparisons: https://www.gamersnexus.net/hwreviews/3377-evga-rtx-2070-black-review-overclocking-fps-temperature-noise. Basically, I have two questions. Firstly, will at least one of the next range of Nvidia graphics cards have a DVI port, and will it exceed the speed/power of the RTX 2070 / GTX 1080 Ti? Another way to ask this might be: will the Nvidia RTX 2160 or 2170 have DVI, and will it be more powerful than an RTX 2070 / GTX 1080 Ti? Is this something we're going to see, or will DVI-port cards die off before exceeding the 1700MHz clock speed?
Secondly, I also want to move to a new 1440p monitor for 3D gaming as soon as possible. Will my CPU/RAM bottleneck 1440p at a 120-144Hz refresh rate, and which graphics card will I need? Is the RTX 2080 Ti powerful enough to drive 1440p at 120-144Hz (120Hz minimum average FPS, for 60 to 70 fps in 3D)? Can I get by with an RTX 2080 for 1440p at 120-144Hz, or might I even have to wait for the next range of Nvidia cards?
  4. I guess a better question might be whether it's going to survive as a technology. If consumers can kill it, they'll probably do everything they can to kill it. Asking consumers to buy into enthusiasm about a product is, I think, probably a step too far; the smart money is on them being cynical and disdainful towards it. I hadn't really looked too closely at it, but I think ray tracing could very well go the way of 3D (it was almost like consumers wouldn't let 3D work, and were determined to put an end to it as soon as possible). I think the lowest common denominator is always going to win through, which is why I think VR is only going to succeed in the mainstream if the headsets are a cheaper option than a monitor, which I don't really see ever happening due to the higher production costs...a nice experiment, but here to stay? If they can make an affordable adaptor to turn smartphones into the screen for VR then maybe, but then you'd have to get people to use "widescreen" (longer) phones, and good luck with that. It might be easier for developers and less work etc., but if the market won't buy ray tracing, then they'll abandon it, because the bottom line for the companies making these games is the bottom line, and the shareholders. They're legally obliged to do the best for their shareholders, and if that means following the market, that's what they'll do, even if the market is turning its back on ray tracing and walking away. For me the ultimate would be 3D G-Sync with ray tracing, but when you compare the image you could otherwise have by putting all of that computing power into the current standard rendering solutions, what are people going to pick? Sadly, when it comes to the mainstream, the answer is: the cheaper option.
  5. Yeah, I don't think it's going to disappear at all. Also, you don't hear about PhysX any more because it's been integrated and it's no longer a selling point; it didn't go anywhere, it just became standard. Or at least that's how I understand it. I think ray tracing is totally going to become standardised, and I totally believe it's the future of gaming; I personally think it's just a matter of how long until more or less every surface you see in a video game is rendered using ray tracing. I wonder what the computational power needed for that would be, however, and theoretically how long it'll be before it's possible.
  6. As per the title. At the moment only parts of the image in video games display ray tracing. What I'm wondering is how long it will take for the surfaces of all of the geometry in games to be rendered using ray tracing?
  7. @Princess Luna @jones177 @GoldenLag Thanks for the input, guys. On reflection, I'm going to stick with my current build and see how long it lasts before I really have to upgrade...hopefully sometime relatively far off in the future. If the 3D modding scene drives an uptake of user-implemented SLI, I'll take a look at things then, but in the meantime I'll be sticking with a single RTX 2070, at least for the lifespan of my current CPU/motherboard/RAM configuration. Thanks again, guys.
  8. What board would you recommend? Also, would you recommend skipping or avoiding the 9900K? You don't seem to like it as a CPU?
  9. Yeah, I already picked up an RTX 2070 because I had a 970 and needed more VRAM for Fallout 4 modding (and to try out Fallout 76), and I found a brand new Strix for £200 under the going rate. The guy wanted quick cash, so I obliged, and now I've got an RTX card. Another benefit of running an RTX card is that when I do play the new games, they have the RTX effects, which I love the look of. As for planning ahead, I just kind of want to plot an upgrade path for myself to motivate me to get the cash together for all of it, and to figure out stuff like: whether I need a new PC case because of SLI heat issues; whether I need to pre-order to secure a flagship motherboard (because that's the one component I'd want to buy new, and maybe even pre-order, to ensure I get one). That kind of thing.
  10. Thanks, that's good of you to clarify that. It's okay, it wasn't really harsh; it's just that people keep telling me that every time I post about it, sometimes repeatedly (although admittedly some of that was trolling). But I understand what you mean now; I hadn't really looked at it that way, and it is appreciated now that I do. Also, I'm going to be buying the 2080 Tis second hand when the prices drop, so it's not going to be a massive outlay. Re. the drivers and future games: there are people in the community who have written scripts that patch the 3D back into the new drivers, and there are people working on converting new games all the time...the list's still growing. I've posted to ask if they might start working on SLI drivers once the system requirements get steeper. If it seems like something that might happen, then I'll probably opt for an SLI config. As for games running better on the architecture they were optimised for, I hadn't actually considered that. You think there'd definitely be that much of a difference, that two 1080 Tis would outperform two 2080 Tis? Ultimately it's the newer titles, just starting to be released and in the pipeline, that I mainly want the 2080 Tis for, so that might not be such an issue...the older stuff would have much more modest system requirements and could at worst probably be brute-forced into running well by the extra GPU processing power? Although that last part's pure assumption; more of a question really.
  11. I think you're right about that. I think for me the question is whether the price is worth paying. I'm definitely going to get at least one, at some point, when the prices come down and I can pick one up second hand on eBay, but as for SLI, I guess I need to ask the 3D community if the coders there might turn their hand to creating SLI profiles when the hardware requirements start to increase beyond the capabilities of the 20 series cards.
  12. Okay, thanks, that was going to be my next question. And you're sure about that? Is that the general consensus, or do you have first-hand experience of this? I only have a mid-tower case, so I'm limited as to the number of fans I can fit.
  13. Also, interested in hearing your thoughts on this. I thought conventional wisdom would suggest they'd get quite hot.
  14. I think I'm just looking for a general guideline, rather than a list of components and where to buy them. I almost posted this as a hypothetical question, and in many ways it is, because it's still at a very early planning stage (which is why I'm trying to do the research) and is in a state of flux. In fact, the stuff you're telling me is actually causing me to question whether I should go SLI at all. I don't mean to frustrate you, but I can't give you solid figures or a definitive time-frame; I do appreciate your willingness to lend assistance, though, and I think the input you've given me so far was at least worth posting the thread for. Thanks so much; I'm just sorry I can't give you any definitive answers. Yes, I do know that: the only place I can post about 3D gaming without that being the immediate response is the 3D Vision forum. As for the money, ultimately I'll be buying the 2080 Tis second hand a few years from now, when they come down in price; I was just trying to get a general idea of the PC spec I'd need to support them. Thanks for the input, it's most appreciated.
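For anyone skimming the 120Hz / DVI questions in post 3, here's a rough back-of-the-envelope sketch. The 330 MHz dual-link DVI pixel-clock ceiling (two 165 MHz TMDS links) and the frame-sequential halving of 3D Vision are standard figures; the 1080p blanking totals are approximate CVT reduced-blanking values, not measured from any particular monitor:

```python
# Rough sanity-check numbers for frame-sequential 3D and dual-link DVI.
# Assumptions: 3D Vision alternates left/right frames, so each eye sees
# half the display refresh rate; dual-link DVI tops out at ~330 MHz pixel
# clock; the 2080 x 1111 total frame size (for a 1920 x 1080 active image)
# is an approximate CVT reduced-blanking figure.

def per_eye_fps(refresh_hz: float) -> float:
    """Frame-sequential 3D shows alternating left/right frames,
    so each eye effectively sees half the refresh rate."""
    return refresh_hz / 2

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock = total pixels per frame (including blanking) x refresh."""
    return h_total * v_total * refresh_hz / 1e6

DUAL_LINK_DVI_LIMIT_MHZ = 330  # two TMDS links at 165 MHz each

# 1080p at 120Hz with reduced blanking (approximate totals)
clock = pixel_clock_mhz(2080, 1111, 120)
print(f"Per-eye fps at 120Hz: {per_eye_fps(120):.0f}")
print(f"1080p120 pixel clock: {clock:.0f} MHz "
      f"({'fits within' if clock <= DUAL_LINK_DVI_LIMIT_MHZ else 'exceeds'} "
      f"dual-link DVI)")
```

This is why 120Hz over DVI-D works for 1080p 3D (roughly 277 MHz, under the 330 MHz limit) and why a 120Hz target maps to the "60 to 70 fps in 3D" figure mentioned above; 1440p at 120Hz would blow well past the dual-link budget, which is why newer high-refresh monitors moved to DisplayPort.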