
Dem0072

Member
  • Posts: 17

Dem0072's Achievements

  1. I'm seriously considering a used 1080 or 2070 when the time comes. I love a good game, but I don't get enough hours in to justify coughing up $600 for a card. Hence why I'm cheap and adopting late. I had an X800 XT Platinum when I was 14, did the fancy case with the elite flagship card thing, and watched it get crushed 8 months later by the GeForce 7 series. Besides, we don't live in 2004 anymore, where every game's a polished product off the shelf as a standard. It's hard to find good games that developers actually finish and get the crap out of pre-alpha before the users get bored with them. And sadly, too often the engine and visual tech is 6+ years old by the time they finally get released. What do I need with a $600 card to run games made by an industry churning out titles that are half a decade old under the hood?
  2. There's always the hope AMD will scare them into an early release, or into a modded version of an existing card at a special price point, to keep them in check. Even the perception of a threat would be enough to greatly help my investment.
  3. If it's intensive (like All the Mods), probably not. But you're talking to the wrong person for intensive MC mods. I'm one of those irritated fans who just wants them to bring the old Tekkit Lite pack of roughly 90 mods (instead of the bloated packs with over 200 of them) up to an MC core version like 1.7 or 1.12. I have no problems with the mods; I just find the old core boring in its biome content and NPCs. Minecraft *can* use more than 8GB, for the client and for the server. If someone wants to go there, they can. I just don't find it as rewarding to do so with how good Tekkit Lite's been to me.
  4. I pity your soul on the VAT and high cost of parts. And yeah, that would be a sensible move if you're already at 4c/8t, which handles 1080p gaming just fine with a good card. I've been pushed too in figuring out where that exact value point is for smoothly feeding an RTX 2070 or another competent 1440p-capable card. I'd be interested in seeing your results supersampling on the 1080; I may end up in a similar boat this year. I really want 1440p 144Hz displays to come down in price, and a card under $300 that can handle them at 60FPS.
  5. I got fast back before VoIP was a thing in games, when you had to get your message out before you got shot.
  6. To my understanding, the RX 500 family was sort of just a recycled extension of the RX 400 family clocked higher. If that level of the game suits your needs, you might even find an RX 480 8GB used on eBay for ~$120, which carries a little risk, but on the other side you're getting a solid 1080p gaming performer for next to nothing. The only new card that really beats it in the 1080p league is the GTX 1060 6GB, which starts at $200-$250, or the GTX 1070 if you're stepping up to the 1440p league, which blows past $400 new. So your decision comes down to whether $80 is worth another ~10FPS at 1080p, or whether the same money is worth a warranty. And if you have 1440p-or-higher ambitions but just need something now to handle 1080p smoothly, $130 is about the cheapest stopgap you'll get without feeling pain.
  7. Gaming-wise, i7s are nice, but we're in the era of i9s now, and those aren't necessarily for games. I'd look into the i5 9600K; it's a $250 chip, so you're not looking at an upgrade that would dole out $500, and if your 1080 is truly being bottlenecked, that chip should get you past it. Having a van load of CPU cores isn't going to mean much in every game, as they vary in how they're built. GPU-wise, I don't think you have a great performance-per-dollar move to make until at least 6 months from now, if AMD delivers on its RX 3000 promises. Otherwise, you're stuck waiting till 2020 in hopes that Nvidia's RTX 2000 successors make a 2070 cheap enough for you to nab. You already invested in a 1080 though, so I'd play on that strength at this point and let the markets make a nice price/value pathway forward for you. I wouldn't go out and buy 1440p or 4K displays though; if you're already having trouble driving what you have with your current rig, you won't be able to address the added visual demand with what you've got. And my feeling is 4-6 months of patience could save you $300+ if AMD meets its mark, which is making 1440p possible on $250 cards. If they succeed, then all the people who have been holding out on paying for a $350-$400 monitor and a $500 card to drive it will have a viable card at half the price, and monitors might see some nice price breaks due to two factors: one, an anticipated surge of orders (if AMD succeeds), and two, Lenovo's 240Hz beasts will devalue 144Hz models, likely triggering a sale. Patience has an opportunity to pay off this year.
  8. I'm in a similar boat, only I haven't got the monitors to dream of 1440p yet. What I've been researching (and clinging to with some hope) is the AMD RX 3000 family stepping in to address that value problem of paying $500 to smoothly play 1440p games on a high refresh monitor. This is all rumor for now, nobody's gotten a crack at benchmarks yet, but at some point in the first half of 2019 AMD is supposed to release them. And the *rumored* price range is under $300, which means that if their intended goal of matching or beating a 2070's performance holds true, you might get a satisfactory card for your goals that delivers consistent 1440p performance (60+ FPS), and with a couple of minor concessions in game settings you might find 4K playable too. Do keep in mind, to my understanding (anyone can correct me if I'm wrong), the higher you go in resolution, the more things like AA/AF and other filtering options can be toggled off, because the hardware is meeting or surpassing what some of those technologies were meant to compensate for at lower resolutions with lesser GPUs driving them.
  9. Sort of. No, you can't allocate it in the hard physical sense, but Minecraft runs on Java, and Java applications can be told a maximum amount of memory to request. For instance, in the batch file that launches an MC server you can cap it at 512MB, 2GB, 3GB, 8GB, and so on (a sketch of what that looks like is below). That said, for 12GB he'd better be inviting dozens of kids to play and running it off a Linux server. Otherwise, anything from 2GB-4GB on a desktop client is extremely generous, and if anything did become a runaway train creeping north to 12, his MC world would probably be on a collision course with a crash. I had to detonate hundreds of nukes in Tekkit Lite to make that happen, which is deliberately breaking the engine.
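A minimal sketch of that kind of launch script, assuming a Windows batch file and a server jar named server.jar (the file name and the sizes here are placeholder choices; -Xms and -Xmx are the standard JVM flags for the starting and maximum heap):

```bat
@echo off
rem -Xms sets the heap the JVM starts with; -Xmx caps how far it can grow.
rem 2G start / 4G cap is already generous for a small modded server.
java -Xms2G -Xmx4G -jar server.jar nogui
pause
```

If the world ever genuinely needs more, bumping -Xmx is a one-line change; the JVM won't grow past that cap on its own.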
  10. Shot in the dark: any onboard graphics device that relies on shared RAM stealing your last 4 gigs?
  11. If you close out of your stuff when you leave the chair, you'll likely not need more than 16GB for a few years, until bloat catches up. YouTube and other social media or complex scripted sites alone went from ~30MB per tab to over 300MB in most cases, so 4 browser windows with 4 tabs apiece in the background (16 tabs at 100MB-200MB each) can easily munch 1.5GB-3GB of your memory, depending on how much is going on in those tabs. So if you're tidy, you don't have anything to worry about. But if you're like me, I don't like bumping my head, so I understand that pickle. Windows, abandoned browser tabs and windows you aren't done with, a few Office and Adobe programs open, and so on add up pretty quickly to a system using 55%-65% of available memory, leaving 6GB-7GB free if you wanted to fire up a game on a 16GB dual-channel kit. And some AAA titles, especially in the era of 1440p and with 4K on the horizon, are probably going to eat that up pretty quickly, leaving your recreational/workstation work butting heads with your gaming sessions. If you don't have all of that work to worry about, or don't leave that kind of tab load open, you'd sail fine with 16GB for years to come. But I end up with things to study, research, find, and work on that can get strung out for days. Saving it all is sort of a pain; I'd rather just have permission to be a desktop slob with my usage habits and not change my behavior. So the next system I get will probably have 32GB of RAM, not because I'll use more than 20 of it, but because they don't make a kit for 20. And the difference between 16 and 32 of the decent-speed stuff is $90. Knowing your CPU, SSD, GPU, and everything else is in order, $60-$90 to never hit your head again because you're in the middle of too much isn't a bad deal.
  12. I just know that if I were going to stick with 1080p for years to come, I could get a card to push that for $130, and monitors I could live with at $150 each. But my visual experience wouldn't change a whole lot; textures would be a little nicer, but I'd be stuck with updated versions of what I have. For $200 more, and some luck in possible upcoming price wars, it might be just enough to get me into 1440p without the fear of struggle.
  13. My thought was that if the 1080 Ti follows the same pattern as the 980 Ti in price drops and used value, AMD's release of the RX 3000 series could knock prices down on 1080 Tis and 2070s. People aren't going to cave until benchmarks actually come in, but rather than waiting until after the RTX 2000 series' successor sometime in 2020... this might be a dark horse that makes 1440p viable on a card at $250 or less. And that would blow away any concerns about skimming the water with a 1070. Lenovo's also releasing a new monitor family this year, 1440p @ 240Hz, and between AMD's RX 3000 release and Lenovo's new family I hope the price war and buzz beat down currently good cards and monitors a bit. It's half my reason for holding and doing my homework now, so when good deals emerge I can act quickly.
  14. Surprisingly, I did see one 1070 on the FB marketplace going for $175, and eBay had some in the $240-$280 range, all used of course. My hesitation was seeing FPS rates for the 1070 at 1440p with high settings; compared to what the 1080 and 2070 were getting, it seemed like it would age a little too quickly at higher resolutions. I keep a TCL 43" 4K TV with built-in Roku hooked up as my third display, but because I still use Windows 7 (a dying romance) I can't adjust DPI independently per display, so I had to downshift the resolution on the TV to 1080p to win back a 60Hz refresh, because 30 was pretty rough. It reminded me of the Egyptian screen saver on Win98 in its choppy feel.
  15. Definitely give it consideration, especially if it comes down in price much more than it already has. 144Hz would be nice; I do plan on having an RTX 2070, or a 1080/1080 Ti when those come down in price too. I know the street value of a 1070 right now is in the $250 range, and I'd like 1440p gaming to be viable on a card at that kind of budget without dipping into the 30s and 40s with frame rates, or being left in the mud a year later. I don't really work in Premiere though; I do Lightroom and Photoshop. One's for chopping up and editing web images, which usually aren't at full detail or resolution, to keep them friendly; Lightroom is where the raw image files get tweaked, and that's more a hobby for now. So I'd say gaming would be as important as a solid resolution upgrade with good color. Budget-wise I'd be doing it all incrementally, a few hundred at a time over months: buy something, sell off what I was using before... buy another thing. I wouldn't be able to spring for $700 in monitors on top of a $400 GPU overnight.