
Zodwraith

Member
  • Posts

    14
  • Joined

  • Last visited

Awards

This user doesn't have any awards

Profile Information

  • Location
    Las Vegas

System

  • CPU
4790K @ 4.6GHz
  • GPU
    1080ti SC
  1. Did the video get taken down? I never got a notification, and I see nothing on YouTube for it between the "bottleneck" and "I was right" videos. I'd like to at least take the quizlet for when more come in. Is it Floatplane only?
  2. I'd be more interested if it's at AIB "MSRPs" vs. Nvidia's, because those are already so ridiculously inflated that this is a non-starter. Asus's own Amazon store has all their cards listed at 220% of original MSRP, just out of stock.
  3. In the years since they surfaced, I've yet to have a bad experience with EVGA. I have no idea how they are behind the scenes, but I've had enough great cards from them that I automatically go to them for every GPU. This just makes me trust them a little more. Some companies wouldn't give an upgraded replacement to a second owner after mounting a water block. Did they give you any grief over that, or did they swap it no questions asked?
  4. While I don't think they charge whatever they want, I will point out the 1080ti really was too good for its generation, but consumers were scoffing at $750-800 flagship GPUs at the time. It already WAS $100 more than the 980ti. I don't know how old you are, but enthusiasts weren't thrilled with it then either. You seem to forget the Bitcoin debacle shifted everything up in price after that, even entry-level cards. Used 1080tis were going for >$1k. The Bitcoin boom ended, but prices only partially recovered. People had adjusted to ridiculous prices, so they were happy to pay high prices compared to the year of straight gouging gamers endured. Enter RTX at post-Bitcoin prices, and that 1080ti shows just how "too good" it was. It was so good that even with masses of mining farms dumping them on eBay, used cards still held MSRP value against RTX pricing. Even NOW they demand $500 used with Ampere on the horizon. So if you do the math, they WERE charging more because there was no competition. It just didn't stand out as badly because Bitcoin hadn't yet shifted everything to the point where your flagship pierced that magic four figures. Saying they weren't gouging because AMD couldn't match it two years later makes no sense; they can't see two years into the future. That only shows how badly AMD failed at making competitive GPUs. RTX only proved even Nvidia struggled to beat the 1080ti, so you can only blame AMD so much.
  5. Then I wouldn't upgrade at all. Your build is balanced, if aging. We all get that upgrade bug now and then when something comes along to remind us gaming rigs age worse than dogs. Mine was RDR2 and Rockstar's lack of optimization for anything that doesn't begin with "AMD". If you HAVE to scratch that itch somewhere, grab a big, affordable 4k panel. Outside of gaming, that high-res real estate pays off BIG, and the 980ti will have no problem pushing it. Good names like Samsung and LG have 50" HDR panels for as low as $350, and that will definitely FEEL like a nice upgrade the first time you browse the web in UHD. They can then run 1080p at 120Hz when you fire up a game once in a while. (You know you want to see Far Cry at least once in glorious 4k.) When the upgrade bug bites again because the GPU feels too slow for even 1080p, you already have a piece of your puzzle, whether your upgrade path is a new GPU or plugging a console into it. TVs make more sense than monitors if you're not gaming competitively: you get bigger, sharper results for a third of the cost, and modern TVs have better response times than yesteryear's monitors. Worst case, you have a solid 4k TV you can put to use anywhere down the road.
  6. Let's not give Intel any ideas about yet another platform change.
  7. When I upgrade, my old parts hand down to my wife's and kids' computers. Keeping 4 PCs current is why I can't afford to buy the latest and greatest every cycle, on top of phones, tablets, cars, and clothes. I think you're right about multi-card setups not being utilized anymore, but multi-chip GPUs are inevitable, especially if panel manufacturers can brainwash the public into thinking 8k is necessary in any way/shape/form.
  8. Decent-name 50" HDR TVs like Samsung and LG are as cheap as $350. OLED isn't necessary, just nice; not worth tripling the price IMHO. When I'm doing a build for someone, the one place I tell them never to skimp is KB/mouse/monitor. If they don't understand what's going on inside the magic box, choose the pieces they have to touch and see with care. Go hit Fry's and touch everything. I used to roll my eyes at "cAn I gEt RGB?!?!?!", but if that excites someone about their computer, more power to them. I'll build the tackiest crap you've ever seen; my daughter's PC looks like a unicorn puked across her desk. I'm the only one in the house who cares about what's inside the PCs, and I won't berate someone for using a TV instead of a monitor, especially now that their response times have gotten far better than early LCDs'.
  9. What generation did you try it on? I just realized my first Voodoo2 SLI setup was in 1998. Guess that dates me. The Voodoo2s were awesome, the 8800GTs were awesome, but the GTX 970s definitely showed SLI was losing support. Part of it was I just loved the tinkering aspect. We were building liquid cooling back around 2000 with a Danger Den block, an automotive transmission radiator, and aquarium pumps. 8D This is where you guys get to say "OK boomer" and I get to reply "you kids wouldn't even have AIOs if it wasn't for us pushing the boundaries 25 years ago. Get off my lawn!"
  10. Just checked on that, and yes, I remember hearing something about it a couple of years ago; both camps have only been quietly experimenting with it for a while. Their big hurdle? “You’re talking about doing CrossFire on a single package, the challenge is that unless we make it invisible to the ISVs [independent software vendors] you’re going to see the same sort of reluctance.” - David Wang, senior VP of Radeon Technologies Group. You know, someone who actually is in the industry. So I guess my point stands: SUPPORT is still the hurdle to multi-GPU. If devs weren't reluctant to support it, Nvidia or AMD would invest more into the problem of fatter pipes between GPUs. Heck, look how long it's taken devs to support more than 2 cores of a CPU. I still say it's got to happen eventually, but probably not until you see a console embrace it.
  11. THIS is what I always understood was the real problem that caused the stuttering. I understand the increased bandwidth in Nvidia's new bridge helps, but it doesn't negate it completely, which only goes back to having proper support in creating a broader link between GPUs (there's a quick frame-pacing sketch at the end of this list that puts numbers on it). Thank you for giving a well-thought-out answer instead of a snide remark.
  12. OK, please enlighten me, because I would love to research something that has obviously gotten past me rather than just be scoffed at. I was under the misconception the forums were for helping each other and spreading ideas. Are GPU makers NOT working towards multi-chip as Acid said and I predicted?
  13. You're in a weird spot where I would probably suggest keeping the monitor and GPU upgrades close to each other. Super-high refresh rates aren't the key for what you play (which is my thing too); you want to enjoy that world. You're going to get more enjoyment by bumping to 1440p or 4k to be immersed, but that will make the 980 hiccup at 1440p and cry at 4k. At high res your i7 is fine for at least a couple more years, as you'll be GPU-bound, and GPU/monitor are universal upgrades that can move from platform to platform. As long as you jump a couple of generations, a GPU is never a BAD upgrade, just not always wise if you can't let it stretch its legs. I'm sitting on a 1080ti right now, flustered that my only solid upgrade is $1200+, as new titles are making me regret being spoiled on 4k for 2 years. 4k panels get funky at 1440p, and 1080p, while aligned with the native res, makes me feel like I'm playing an N64. If you're a small-monitor guy, go 1440p. If you love a massive screen, 4k pays you back. RDR2 is a good benchmark to pull from because it's a sign of future titles: it's that open world you like, it's poorly optimized like many console ports are, but it looks stunning at 4k. My only solid answer is that even being an Nvidia guy myself, at that price point I would steer you towards the 5700xt. Open world is big on consoles, and AMD is already confirmed with near-identical hardware for the upcoming ones. Ray tracing is useless at that level and just a boondoggle.
  14. Wanting 4k performance without spending car money led me here. SLI was never a bad idea; it died because no one supported it. GPU makers gained 30-50% performance each generation, so there was little reason for customers to embrace it, GPU makers wanted you buying their new cards, and game devs didn't want to waste resources optimizing configurations for a single-digit percentage of customers. You got a 50-70% boost at best and a negative impact at worst, double the power consumption and thermals, stuttering in some games, and straight-up crashes in others. But at least you had a stopgap measure you could employ to breathe new life into a system. Every single con in "why SLI died" is correct, but aside from thermals and power, every one of them would have been solved with support, which never happened.

      Fast-forward to today, and my how times have changed. Many-core CPUs are now heralded as premium when they looked just as foolish before, because they're finally getting SUPPORTED. GPU growth has stagnated while prices have doubled for marginal performance boosts. Enthusiasts are under an Nvidia monopoly, and mid-range cards are $400-500. The GPU market is in a sad state when the new 5700xt is AMD's flagship, applauded for only costing $450 while it struggles to keep up with a 2.5-year-old 1080ti. This is unacceptable. GPUs are where CPUs were when Intel was trying to push the P4 as hard as possible and pretend multi-core was not the future. Twin 5700xts would make the 2080ti's price seem even more ludicrous and easily annihilate it if there were real support (rough numbers in the second sketch below).

      I understand Nvidia's lack of support; they can point at their more profitable cards as your option, and the higher you move up any stack, the more the returns diminish for your money. But AMD dropping CF entirely is completely stupid when they have ZERO options for enthusiasts. Should I be madder at Nvidia for price gouging but at least giving me an option, or at AMD for saying they don't even want my business? Imagine many lower-power GPU cores spread across a large heat sink instead of one insanely hot one. It all comes down to support and design, and no one is even looking down that multi-GPU street; they just keep paving the single-GPU one up an increasingly steep and expensive hill. The anemic trickle of SLI support looks like it was tacked on by an intern 6 months after launch. Moore's law fell apart long ago, but for the life of me I can't understand why people are blind to multi-GPU while embracing multi-core CPUs.

      I've run SLI on Voodoo2s, 8800GTs, and GTX 970s, and now I'm considering a second 1080ti, combing forums for the tricks and workarounds enthusiasts are employing to get better performance. There was at least mediocre support before, with a lot of games seeing well above a 50% boost, and two good-tier cards annihilated the flagship for similar cost. I have zero doubt that as we hit physics and heat limitations, multi-GPU will return and finally be embraced and supported, and us old-school SLI/CF guys will be vindicated. Until that day, enjoy that second mortgage if you're an enthusiast, because I'm not sure AMD is competent enough to compete, or that they wouldn't charge just as much or more given how their current pricing looks vs. their own Polaris. Just my .02
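To put rough numbers on the micro-stutter point from #11: a minimal Python sketch with made-up present timestamps (not a real capture) showing how AFR can look healthy on average FPS while the frame-to-frame cadence lurches.

```python
# Minimal micro-stutter sketch; the timestamps are illustrative assumptions,
# not measured data. Two GPUs alternating frames (AFR) can deliver them in
# uneven pairs, so the average looks fine while the cadence is all over.

timestamps = [0, 4, 33, 37, 66, 70, 99, 103]  # hypothetical present times (ms)

frame_times = [b - a for a, b in zip(timestamps, timestamps[1:])]
avg = sum(frame_times) / len(frame_times)

print("frame times (ms):", frame_times)   # [4, 29, 4, 29, 4, 29, 4]
print(f"average: {avg:.1f} ms -> ~{1000 / avg:.0f} FPS on paper")
print(f"but the cadence swings between {min(frame_times)} and {max(frame_times)} ms")
```

That ~68 FPS average hides a 4 ms / 29 ms lurch, which is exactly why a fatter, properly supported link between the GPUs matters more than raw average throughput.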
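And to put equally rough numbers on the cost argument in #14: a back-of-envelope sketch where every price and FPS figure is an illustrative assumption (not a benchmark), using the post's "50-70% boost at best" scaling.

```python
# Back-of-envelope for dual mid-range vs. one flagship. All prices and FPS
# figures below are illustrative assumptions, not measured results.

FLAGSHIP_PRICE, FLAGSHIP_FPS = 1200, 100  # a 2080ti-class card (assumed)
MID_PRICE, MID_FPS = 450, 75              # a 5700xt-class card (assumed)
SCALING = 0.70                            # "50-70% boost at best" -> best case

dual_fps = MID_FPS * (1 + SCALING)        # 75 * 1.7 = 127.5
print(f"dual mid-range: ${2 * MID_PRICE} for ~{dual_fps:.0f} FPS "
      f"(${2 * MID_PRICE / dual_fps:.2f}/FPS)")
print(f"flagship:       ${FLAGSHIP_PRICE} for ~{FLAGSHIP_FPS} FPS "
      f"(${FLAGSHIP_PRICE / FLAGSHIP_FPS:.2f}/FPS)")
```

Under those assumptions the pair lands around $7/FPS vs. $12/FPS for the flagship, which is the whole argument: the math only works if the scaling actually gets supported.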