
Zando_

Member

Reputation Activity

  1. Like
    Zando_ reacted to tommyT1 in red dead redemption 2 maxing out 3070ti graphics card   
    Yeah, I have been looking into that. I should really do some experimental overclocking once I find the courage to, lol.
  2. Agree
    Zando_ got a reaction from tommyT1 in red dead redemption 2 maxing out 3070ti graphics card   
    It... can. I have had server CPUs from 2008/9-ish, and still have a bunch from 2011-2012. I have a GPU from around 2008 in my Mac Pro from 2006. They'll probably die at some point (maybe, possibly), most likely from capacitors eventually failing on the motherboard or GPU PCB, not the core dies themselves. Dying from use... I've never had that happen. If your hardware is kept cool and within safe voltage tolerances, it will not die from just doing its job, realistically ever; by the time it does, it'll be entirely useless for any task anyway. 
    Bringing the power target down should be a better way of fixing that. You can easily drag it down using MSI Afterburner; it has a slider for it. 
    It's great design. They put the choice of how the game runs in your hands. People who want to run higher frames can do so; if you don't want to, you can cap it. If they did the cap on their end, all the people who want high framerates would be shit outta luck. 
  3. Agree
    Zando_ reacted to Radium_Angel in Help me choose between 2 pre-builds   
    What are the rest of the system specs? That is equally important
  4. Agree
    Zando_ got a reaction from Levent in red dead redemption 2 maxing out 3070ti graphics card   
    As @Levent said, because it's GPU limited (in other words, it wants all the GPU horsepower it can get its hands on, always). You can remove the GPU limitation by lowering settings and setting an fps cap; if you don't, it'll just run a higher fps and you'll see the same usage. Rockstar did an impressive job with RDR2; it runs exactly as a game should: it will use all the GPU possible and a good chunk of CPU to run as fast as possible, and you have to manually force it not to. Not sure why this is an issue, it's expected behavior for a game. It would only be a problem if you were seeing 100% usage but getting unplayable framerates; then something would be weird. 
     
    Is there a reason you think it shouldn't use 100% of your GPU? It's not harmful; GPUs are designed to run at 100% for years upon years, and usually people want games to take advantage of their hardware, not just leave it idling. If you're concerned about power draw or something, just pull down the power target for the GPU so it boosts lower and pulls less power. That should keep more consistent frametimes than setting an fps cap; I've never found caps to be a good experience. 
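Under the hood, an fps cap is just the frame loop sleeping out whatever is left of each frame's time budget instead of immediately rendering the next one. A minimal sketch of the idea (a hypothetical frame loop, not any game's or driver's actual limiter):

```python
import time

def run_capped(render_frame, fps_cap=60.0, frames=10):
    """Run a render loop capped at fps_cap by sleeping out the leftover frame budget."""
    budget = 1.0 / fps_cap  # seconds each frame is allowed to take
    frametimes = []
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                    # the actual work for one frame
        worked = time.perf_counter() - start
        if worked < budget:
            time.sleep(budget - worked)   # idle out the rest of the frame budget
        frametimes.append(time.perf_counter() - start)
    return frametimes

# With a trivially cheap "frame", every frametime should land at or above the
# 60 fps budget of ~16.7 ms, i.e. the loop never runs faster than the cap.
times = run_capped(lambda: None, fps_cap=60.0, frames=5)
```

This is also why a cap leaves the GPU partly idle (the sleep), while lowering the power target instead keeps the GPU rendering flat out, just at lower clocks.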
  5. Like
    Zando_ got a reaction from tommyT1 in red dead redemption 2 maxing out 3070ti graphics card   
    As @Levent said, because it's GPU limited (in other words, it wants all the GPU horsepower it can get its hands on, always). You can remove the GPU limitation by lowering settings and setting an fps cap; if you don't, it'll just run a higher fps and you'll see the same usage. Rockstar did an impressive job with RDR2; it runs exactly as a game should: it will use all the GPU possible and a good chunk of CPU to run as fast as possible, and you have to manually force it not to. Not sure why this is an issue, it's expected behavior for a game. It would only be a problem if you were seeing 100% usage but getting unplayable framerates; then something would be weird. 
     
    Is there a reason you think it shouldn't use 100% of your GPU? It's not harmful; GPUs are designed to run at 100% for years upon years, and usually people want games to take advantage of their hardware, not just leave it idling. If you're concerned about power draw or something, just pull down the power target for the GPU so it boosts lower and pulls less power. That should keep more consistent frametimes than setting an fps cap; I've never found caps to be a good experience. 
  6. Agree
    Zando_ reacted to Levent in red dead redemption 2 maxing out 3070ti graphics card   
    That means the game is GPU limited. There is nothing to be confused about here. As everyone here said, RDR2 is a very demanding game to run.
  7. Like
    Zando_ reacted to PDifolco in is my 3080 still good today   
    Calling a 4080 "pretty cheap" nowadays is a pretty bold statement 😛 
    What can justify upgrading from a 3080 (that's just what I did a month ago!) is better gameplay at UW 1440p or 4K in demanding games (JS, LoU, CP2077, HL, TWW3...) and fewer VRAM limitations (mine was a 10GB model).
    But indeed, if you don't notice bad framerates or frame drops, there's no need to upgrade...
  8. Agree
    Zando_ got a reaction from Levent in red dead redemption 2 maxing out 3070ti graphics card   
    ^^^ Unless you weren't able to run it playably, there doesn't seem to be an issue. RDR2 will take everything you can throw at it; if you drop settings but do not have an fps lock in place, it'll just run a higher framerate, still maxing out the card. Even on all low, I don't think most GPU/CPU combos can cap out... whatever the frame cap is for the engine Rockstar used. 
  9. Agree
    Zando_ reacted to Levent in red dead redemption 2 maxing out 3070ti graphics card   
    RDR2 is also very CPU-heavy. I bet your issue is not the GPU but the CPU or your RAM. I played over 100 hours of RDR2 on a 5800X and RTX 3070 with zero crashes.
     
    Corsair RAMs are notorious for not being PnP on AM4.
     
     
    Also yeah, RDR2 is not solitaire, it will max out whatever you throw at it.
  10. Agree
    Zando_ reacted to filpo in Can I reuse my NVME heatsink?   
    the mobo heatsink? Reuse it
  11. Like
    Zando_ reacted to MantraWeasel in Show off Your Setup! (Rev.2)   
    AMD Ryzen 3900X
    MSI B550-A PRO
    Corsair LPX 16GB (2X8)
    NVIDIA RTX 3070
    Corsair 4000D
    Seasonic Focus GX-850

     
    My domain/WFH room. PC is on a trolley underneath the desk. 
     

     
    Bonus pre-build picture.
     

  12. Like
    Zando_ got a reaction from Hravec in Resize BAR on 5700g   
    reBAR is an optional PCIe feature: https://www.nvidia.com/en-us/geforce/news/geforce-rtx-30-series-resizable-bar-support/. 
     
    If the iGPU on the 5700G is connected via PCIe lanes, and the motherboard OEM exposes the option, you may be able to enable it. 
  13. Like
    Zando_ reacted to elderan in X299X DESIGNARE 10G not detecting 256gb RAM but showing in bios.   
    Task Manager sees 128. 
  14. Like
    Zando_ reacted to Crunchy Dragon in EVGA X299 DARK Guide...   
    FTW-K 😉
     
    At this point, I would be a little more concerned with keeping it cool rather than keeping it fed. Maybe try running 4.4 or 4.5GHz for a while and see how it does.
  15. Like
    Zando_ got a reaction from blackhose746 in EVGA X299 DARK Guide...   
    With an easy OC and a delid (for the 7000 series; the 10000 series are soldered already, so they'll be a bit hotter but not as bad), a 280mm AIO will be fine. My 18c chip will reach up to 80-90°C under sustained AVX load, but at voltages barely above stock I'm not terribly worried about that. In general use or games it doesn't step out of the 70s. 
  16. Like
    Zando_ reacted to Fred Castellum in EVGA X299 DARK Guide...   
    You're right, the overclocking part flew right over my head. Probably shouldn't be posting here while at work, lol.
  17. Agree
    Zando_ got a reaction from Fred Castellum in EVGA X299 DARK Guide...   
    Oh god, nowhere close. OP mentioned overclocking. Stock, these chips are docile; OCed, the 16-18c ones can pull ~500W, and the 12c will not be far behind. 
     
    For HEDT, always use both 8-pins if there are sockets for them. Any XOC board with an 8+8 has those for a reason: it can and will deliver enough power to the PC to need them. I have a smol ATX board with only a single 8-pin, but that's about it. My Classifieds and Darks all have 8+8, and the CPUs on X99 and X299 can easily pull north of 300W. My Kill A Watt broke before I could test my X79 stuff, but I'd assume that gets high too, though my i7 for that platform is only a 6c, so it wouldn't match the 8/10/18-core chips I have/had on the other platforms. 
     
    I'd get a 1000W PSU and just chill with that, though if you're going to move to a very beefy watercooling setup (my friend used a MO-RA3, a giant external radiator), you can push a 7980XE to north of 700W IIRC. The 12-core would be lower, but still quite a chunky power draw. 
    ^^^ Basically this. You can squeak by if you have an external power monitor and do the math to see what you're pushing the CPU to (I have found software monitoring useless; it tries to tell me my 7980XE is pulling 14W, but given the heat it dumps into my room, I suspect that is a lie). If you want peace of mind, just get an appropriate PSU. 
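The "do the math" step with a wall meter boils down to: DC power into the system ≈ wall watts × PSU efficiency, then subtract an estimate for everything that isn't the CPU. A rough sketch; the efficiency and rest-of-system figures below are placeholder assumptions you'd fill in for your own build:

```python
def estimate_cpu_watts(wall_watts, psu_efficiency=0.90, rest_of_system_watts=150.0):
    """Rough CPU package draw inferred from an external wall-meter reading.

    wall_watts: what the wall meter (e.g. a Kill A Watt) shows.
    psu_efficiency: assumed PSU efficiency at this load (check its 80 PLUS curve).
    rest_of_system_watts: your guess for GPU/drives/fans/board draw.
    """
    dc_watts = wall_watts * psu_efficiency  # power actually delivered to components
    return dc_watts - rest_of_system_watts

# Example: 700W at the wall with a 90%-efficient PSU and ~150W for the rest of
# the system works out to roughly 480W on the CPU, in the same ballpark as the
# ~500W figure quoted for an OCed 16-18c chip.
cpu_w = estimate_cpu_watts(700)
```

It's only an estimate (efficiency varies with load, and the rest-of-system number is a guess), which is exactly why sizing the PSU with headroom is the low-stress option.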
  18. Like
    Zando_ got a reaction from blackhose746 in EVGA X299 DARK Guide...   
    Oh god, nowhere close. OP mentioned overclocking. Stock, these chips are docile; OCed, the 16-18c ones can pull ~500W, and the 12c will not be far behind. 
     
    For HEDT, always use both 8-pins if there are sockets for them. Any XOC board with an 8+8 has those for a reason: it can and will deliver enough power to the PC to need them. I have a smol ATX board with only a single 8-pin, but that's about it. My Classifieds and Darks all have 8+8, and the CPUs on X99 and X299 can easily pull north of 300W. My Kill A Watt broke before I could test my X79 stuff, but I'd assume that gets high too, though my i7 for that platform is only a 6c, so it wouldn't match the 8/10/18-core chips I have/had on the other platforms. 
     
    I'd get a 1000W PSU and just chill with that, though if you're going to move to a very beefy watercooling setup (my friend used a MO-RA3, a giant external radiator), you can push a 7980XE to north of 700W IIRC. The 12-core would be lower, but still quite a chunky power draw. 
    ^^^ Basically this. You can squeak by if you have an external power monitor and do the math to see what you're pushing the CPU to (I have found software monitoring useless; it tries to tell me my 7980XE is pulling 14W, but given the heat it dumps into my room, I suspect that is a lie). If you want peace of mind, just get an appropriate PSU. 
  19. Like
    Zando_ got a reaction from blackhose746 in EVGA X299 DARK Guide...   
    I'd assume you can push something similar to what I run without issue. I'm at basically stock voltage running 4.0GHz (3.8GHz under AVX load due to the offset); I find that runs all my games and stuff acceptably, so I haven't pushed it much farther (I'm on an AIO and just don't have the energy to tweak chips much anymore). Power draw would still be a good bit over stock (as power draw scales with clocks, not just voltage), but not the crazy high numbers; those are all at 1.2v and above. Basically what @Crunchy Dragon said (he has a Classified IIRC, the more-creature-comforts version of the Dark, so he can also provide info directly from experience). 
  20. Agree
    Zando_ reacted to Crunchy Dragon in EVGA X299 DARK Guide...   
    If you don't want to do any serious overclocking, 850W should be fine.
     
    If you want to really push and see how far you can go, 1000W is the absolute minimum I'd consider, and I'd probably be looking for 1200W.
  21. Agree
    Zando_ got a reaction from Crunchy Dragon in EVGA X299 DARK Guide...   
    Oh god, nowhere close. OP mentioned overclocking. Stock, these chips are docile; OCed, the 16-18c ones can pull ~500W, and the 12c will not be far behind. 
     
    For HEDT, always use both 8-pins if there are sockets for them. Any XOC board with an 8+8 has those for a reason: it can and will deliver enough power to the PC to need them. I have a smol ATX board with only a single 8-pin, but that's about it. My Classifieds and Darks all have 8+8, and the CPUs on X99 and X299 can easily pull north of 300W. My Kill A Watt broke before I could test my X79 stuff, but I'd assume that gets high too, though my i7 for that platform is only a 6c, so it wouldn't match the 8/10/18-core chips I have/had on the other platforms. 
     
    I'd get a 1000W PSU and just chill with that, though if you're going to move to a very beefy watercooling setup (my friend used a MO-RA3, a giant external radiator), you can push a 7980XE to north of 700W IIRC. The 12-core would be lower, but still quite a chunky power draw. 
    ^^^ Basically this. You can squeak by if you have an external power monitor and do the math to see what you're pushing the CPU to (I have found software monitoring useless; it tries to tell me my 7980XE is pulling 14W, but given the heat it dumps into my room, I suspect that is a lie). If you want peace of mind, just get an appropriate PSU. 
  22. Agree
    Zando_ reacted to RONOTHAN## in EVGA X299 DARK Guide...   
    If you don't overclock, yes. If you do overclock, X299 gets very power hungry very quickly, and you're easily able to overwhelm a single 8-pin without trying that hard. If you're monitoring power consumption and making sure it's below ~350W the entire time, you can probably be OK, but you really should be looking into a higher-wattage unit if you really want to max out a Skylake-X chip. 
  23. Agree
    Zando_ reacted to matt0725 in If ltt uses 12k cameras, why isn't his image sharp?   
    Along with the reasons everyone else gave, they use all kinds of lenses, and it's possible to be out of focus for a moment (or more), which will be blurry/not sharp even if they recorded in 120k.
  24. Agree
    Zando_ got a reaction from TylerD321 in If ltt uses 12k cameras, why isn't his image sharp?   
    A few reasons:
    They only (AFAIK) post up to 4K videos. As others noted, YT compression crunches said videos.
    Your player says "HD", meaning you are viewing 1080p or 1440p, so it's even more downscaled and crunched than 4K would be.
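The downscale part is easy to quantify: resolution scales with the product of width and height, so a 1080p stream carries only a quarter of the pixels of a 4K upload, before compression throws away further detail. Quick arithmetic sketch:

```python
# Pixel counts per frame for common playback resolutions
resolutions = {
    "4K":    (3840, 2160),
    "1440p": (2560, 1440),
    "1080p": (1920, 1080),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

# A 4K frame has exactly 4x the pixels of a 1080p frame
# (and 2.25x the pixels of a 1440p frame).
ratio = pixels["4K"] / pixels["1080p"]
```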
  25. Like
    Zando_ reacted to stanley16 in First PC Build AMD (replacing mid 2012 macbook pro retina)   
    Finally building my first custom PC to use for my design software and some gaming (after watching a lot of LTT videos).
    This will be replacing my old mid 2012 macbook pro retina.
     
    https://pcpartpicker.com/list/cpCqTn
     
    I highly recommend this Meshify 2 Mini case, it was super easy to build in!
     
     

     