Veninblazer

Member
  • Posts

    112
  • Joined

  • Last visited

Awards

This user doesn't have any awards

1 Follower

About Veninblazer

  • Birthday Jul 26, 1999

Contact Methods

  • Steam
    deathbat6916
  • Origin
    deathbat6916
  • Battle.net
    deathbat6916
  • Twitch.tv
    deathbat6916
  • Twitter
    deathbat6916

Profile Information

  • Gender
    Male

System

  • CPU
    i7-3770S
  • Motherboard
    Dell OptiPlex 9010
  • RAM
    16GB Hynix DDR3-1600
  • GPU
    GTX 1050 Ti
  • Case
    Dell OptiPlex Case
  • Storage
    250GB Samsung 850 Evo + 2TB WD Blue
  • PSU
    275W

Recent Profile Visitors

537 profile views

Veninblazer's Achievements

  1. I think the fact that I can run 2018-2019 titles at High or better, and they end up looking better than 2077 at Low, speaks for itself, but I digress.
  2. It's an oof, especially for budget guys who don't really have $500+ to spend on a video card and so have to deal with both worse visuals and worse performance, because the game doesn't scale down very well.
  3. The fact that 60 FPS is basically impossible on a lot of hardware if you truly want next-gen quality (which, I'm sorry, but for this game means ray tracing) is kind of a mega oof.
  4. Yeah, and honestly, I have at least a dozen games that look and run better on my specific set of hardware. To get a solid 60 with visuals true to next-gen glory you're gonna need a 3090, straight up, and even then it's not guaranteed.
  5. Oh yeah. My Ryzen 5 3600 gets hit hard on all 12 threads; I can't even turn up the crowd density because the framerate drops are too bad.
  6. I respect the effort LSG made, but that is so disrespectful to the art design that I'd rather just run it on better hardware. At least with my modded configs and 70-100% dynamic resolution I can maintain an okay-ish 60 on my 1060 6GB and still stay true to the art design.
  7. Take Cyberpunk 2077. With DLSS set to Performance, say a card gets 80 FPS. In Metro Exodus, that same card gets 90-100 FPS natively, without DLSS. Even with the render resolution far below native (see the render-resolution sketch after this list), Cyberpunk still can't match Metro Exodus performance while not looking THAT much better. Surely there must be some kind of optimization issue, but no, you need to upgrade and you're just poor for one game. Suuuuure.
  8. Exactly. "But I need my 8x antialiasing in Deus Ex or it's a poor port," says the enthusiast with a bottomless wallet.
  9. And some games just straight up run like hot garbage on console but alright on PC. Just Cause 3 runs fairly decently for me, but it took a RAM upgrade to become fully stable.
  10. It's just that I'm tired of people with bottomless wallets dumbing down the idea of what optimization means.
  11. A 960 is a decent card; I have a 1050 Ti and it's not much faster. BF1 at Medium runs at about 100 FPS for me, no problem, and even on the i5 I was using a while back, which isn't really any slower than a stock 2500K, I wasn't getting that terrible a framerate (mid-50s mostly). BF4 also runs at Ultra at 70-80 FPS for me, so I think there's an odd issue in your use case. And it legitimately seems like entitlement, a la "I paid X for my system, therefore all games must do X!"
  12. Why does it seem that, ever since the last year of games and the 10-series graphics cards released, many PC gamers have gotten plain fucking stupid about how games are optimized? Like the standard for lower-end video cards has arbitrarily gone up for some reason? That is, if a game drops a mid-level ($250-400) card below 60 with everything maxed, it's automatically indicative of "poor optimization," no matter what's on screen, and then Steam is flooded with negative reviews and every YouTube benchmark is flooded with comments about the supposed poor optimization. I'll tell you this much: if the games being scapegoated (and they all seem to be from the same company with a few exceptions, which is just ridiculous) were really that badly optimized, there's no way in hell I'd be able to run them as well as I do on my system, which is NOT a beast by any means - it's rather budget-end and would only be high-end by the standards of five years ago, or mid-range by 2014's standards.

      Why the hell is "Ultra" the baseline, when that's almost always the afterthought setting? It isn't 2007 anymore; most games look just as good at High as they do at Ultra, yet High often performs anywhere from 25-50% better (so 48 FPS at Ultra becomes roughly 60-72 FPS at High), and Ultra is still what gets used to scapegoat games. In fact, the only game whose max settings people seem to accept as being as ridiculous as they are is Grand Theft Auto V, which Joker Productions tested on a 1080 Ti at 1080p with every single thing turned up, and not even THAT could sustain a solid 60 in his test. But nobody calls GTA V "unoptimized" because of that. So why exactly are other games (such as Deus Ex: Mankind Divided, Rise of the Tomb Raider, Assassin's Creed Syndicate, Tom Clancy's The Division, Watch Dogs 2, and 2016's Hitman) scapegoated if even a GTX 1060 (a mid-range card) drops frames at max settings?

      It's one thing if you're having random FPS drops in certain areas for no reason (Fallout 4, for example), if the game has serious issues even at the lowest settings (Nier: Automata), if it has massive frame-pacing issues (Dishonored 2), or if there's no real framerate difference between settings (Mafia 3 and Dark Souls 3). It's another when your only issue is dropping below 60 at max settings (provided the rest of your system isn't bottlenecking, since some of these are CPU/RAM-intensive titles too) and you refund as a result of that alone.
  13. Yeah, I'm just not convinced they'll undercut by that much. It's more believable that they'll match the 1080 Ti at around 1080 prices.
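
A minimal sketch of the render-resolution arithmetic behind post 7, assuming the commonly cited DLSS 2.x per-axis scale factors (Quality ~0.667, Balanced ~0.58, Performance 0.5, Ultra Performance ~0.333); individual games can use different factors, so treat the numbers as illustrative rather than exact:

```python
# Rough DLSS render-resolution arithmetic. The per-axis scale factors below are
# the commonly cited DLSS 2.x defaults and are an assumption here; games may differ.
DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def render_resolution(native_w, native_h, mode):
    """Internal resolution the game renders at before DLSS upscales to native."""
    scale = DLSS_SCALE[mode]
    return round(native_w * scale), round(native_h * scale)

native_w, native_h = 2560, 1440  # example native output resolution (1440p)
for mode in DLSS_SCALE:
    w, h = render_resolution(native_w, native_h, mode)
    pixel_share = (w * h) / (native_w * native_h)
    print(f"{mode:>17}: {w}x{h} (~{pixel_share:.0%} of native pixels)")

# At 1440p, Performance mode renders ~1280x720, i.e. about a quarter of the
# native pixels - which is the point of the comparison in post 7: even at that
# internal resolution the card only reaches ~80 FPS, while Metro Exodus hits
# 90-100 FPS fully native on the same hardware.
```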