
jones177

Member
  • Posts: 3,560
  • Joined
  • Last visited

Reputation Activity

  1. Agree
    jones177 got a reaction from NobleGamer in PC Builds Bottleneck Calculator   
    It is silly.
    I ran my 5900x, i9 11900k and i9 10900k with an RTX 3090 at 4k and it said the 3090 is too weak for the CPUs.
     
    In reality it is the other way around. I need more CPU to get more frames out of a 3090 at 4k. 
    I use games to get an idea of when a CPU is no longer capable of getting the most frames out of a GPU, since their percentages mean something.
     
    I use the game Shadow of the Tomb Raider to compare CPUs with an equal number of cores.
    Looks like this.
     
    5800X with SOTTR at the highest preset.
    GPU Bound        1080p    1440p      4k
    RTX 2080 ti        48%      98%    100%
    RTX 3080 ti        38%      84%    100%
    With an i9 9900k
    RTX 2080 ti        36%      86%    100%
    RTX 3080 ti         2%      48%    100%
     
    Then when I add frames it becomes clearer.
    RTX 2080 ti      1080p      1440p        4k
    i9 9900k          168fps      123fps      67fps
    5800x              160fps      122fps      73fps
    RTX 3080 ti 
    i9 9900k          169fps      154fps      96fps
    5800x              198fps      163fps      96fps
     
    So with this game, an upgrade from a 2080 ti to a 3080 ti with an i9 9900k at 1080p is pointless. At 1440p it is livable but not ideal, and at 4k it is still doing fine.
     
    SOTTR is at the extreme when it comes to CPU bottlenecks, so it is the worst case. There are games like Horizon Zero Dawn that are GPU bound even at 1080p, so that is the best case.
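     
    To put numbers on "pointless", here is a quick sketch (plain Python, using only the fps figures from the tables above) that works out the percentage uplift the 3080 ti gives over the 2080 ti with each CPU. It is just an illustration of how I read my own results, not a general rule.
     
    # Uplift from a 2080 ti -> 3080 ti upgrade, per CPU and resolution,
    # using the SOTTR fps numbers from the tables above.
    fps = {
        ("i9 9900k", "RTX 2080 ti"): {"1080p": 168, "1440p": 123, "4k": 67},
        ("i9 9900k", "RTX 3080 ti"): {"1080p": 169, "1440p": 154, "4k": 96},
        ("5800x",    "RTX 2080 ti"): {"1080p": 160, "1440p": 122, "4k": 73},
        ("5800x",    "RTX 3080 ti"): {"1080p": 198, "1440p": 163, "4k": 96},
    }
    for cpu in ("i9 9900k", "5800x"):
        for res in ("1080p", "1440p", "4k"):
            old = fps[(cpu, "RTX 2080 ti")][res]
            new = fps[(cpu, "RTX 3080 ti")][res]
            print(f"{cpu:9} {res:6} {(new - old) / old * 100:5.1f}% faster")
     
    With the i9 9900k that works out to under 1% at 1080p, about 25% at 1440p and about 43% at 4k, while the 5800x gets roughly 24-34% at every resolution.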
     
  2. Like
    jones177 got a reaction from Fasauceome in EVGA 1000 G+ with non US cord   
    Thanks.
    Newegg support is useless on weekends so I have to wait till Monday.
     
  3. Like
    jones177 got a reaction from LogicalDrm in Any suggestions on how could I get more clients for this service I'm trying to offer?   
    I showed my son the post since he is a vocational education teacher.
    He says my advice is dated and no longer relevant. 
    He was very convincing. 
     
    I will be sticking with things that I am current on, like GPUs and CPUs, and leave the career advice to others.
     
    Thanks for replying.
  4. Like
    jones177 got a reaction from TVwazhere in ZOTAC GeForce RTX 4090 graphics card picture has been leaked   
    I had the tall card issue with the 3090 ti. 
    I went with EVGA since their connector is on the back, so it would fit in an O11D.
     
  5. Agree
    jones177 reacted to xg32 in Fill all memory slots?   
    Only use 2x16 with DDR5.
  6. Informative
    jones177 reacted to RONOTHAN## in Fill all memory slots?   
    TL;DR: With DDR5, use 2 DIMMs whenever possible. 
     
    For the more complete answer, there are a number of factors at play. 
    DDR5 memory topologies are very weak right now, and 4-DIMM setups are for the most part just kind of broken. Default speeds drop down to something like 4000MT/s or lower (worse than DDR4), if it even works at all. ASUS seems to have gotten it to kind of work, but it still doesn't really make sense for most people.
     
    The reason 4x8GB performed better than 2x16GB on DDR4 is memory ranks. More ranks means more rank interleaving and better performance per clock, though there are usually diminishing returns after 2 ranks (on DDR4; DDR5 is different), so quad rank and triple rank were barely different in performance at given settings, and triple rank is only marginally faster than dual rank in some synthetic tasks. 2x16GB DDR4 kits, unless you were very particular about the memory you were buying, were pretty much always single rank towards the end of DDR4's lifespan, and since ranks only need to be present in the memory channel, running 2 DIMMs per channel (4 DIMMs total on a consumer motherboard) was the easiest way to guarantee effectively dual rank operation. Single rank operation was still preferred in a lot of applications for quite a while, though, because the more ranks a memory controller has to manage the more stress it is under, and depending on the controller that can significantly reduce the maximum achievable frequency (5GT/s RAM, for example, is not super difficult to achieve with a single rank setup on a Ryzen 3000/5000 memory controller when desynchronized with a good motherboard and RAM kit, but with dual rank the memory speed drops to about 4GT/s if you're lucky). Dual rank still makes sense in a lot of situations: 10th gen and later Intel DDR4 memory controllers and Ryzen 3000/5000 series chips do perform best for daily use with dual rank memory, since either the memory controller can handle running high speeds (Intel) or there just isn't much reason to run that high a frequency in general (Ryzen).
     
    With DDR5, the rank discussion is a bit different. The 8GB DDR5 DIMMs are all guaranteed to be 1Rx16, which (for a couple of reasons) is kind of like half rank memory (not exactly, but close enough) and has significantly less performance than 16GB DDR5 DIMMs. Running 4x8GB DDR5 would perform basically the same as running 2x16GB DDR5, but with all of the downsides of running 4 DIMMs. Even when running 4x16GB or 2x32GB setups (both of which are proper dual rank), DDR5 doesn't show nearly the same benefit from rank interleaving that DDR4 did, so the performance difference between single rank DDR5 and dual rank DDR5 is very similar to the difference between dual rank DDR4 and quad rank DDR4 (it exists, but is only really noticeable in 1-2 benchmarks). Unless you need the capacity, there is no real performance benefit in going for 64GB of DDR5 over 32GB, especially since, as with DDR4, it puts a lot more stress on the memory controller and reduces the maximum speed you can reasonably expect (if you're just running XMP at 6200MT/s or slower, though, that doesn't really matter, since both should be about equal there).
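     
    A minimal sketch of the rank arithmetic behind that advice (an illustration only, assuming a standard dual-channel consumer board; the per-DIMM rank counts are typical examples, not guarantees for any specific kit):
     
    # Ranks only need to be present somewhere in a channel, so they add up per channel.
    def ranks_per_channel(num_dimms, ranks_per_dimm, channels=2):
        return (num_dimms // channels) * ranks_per_dimm
     
    print(ranks_per_channel(2, 1))  # 2x16GB single-rank DDR4 -> 1 rank/channel
    print(ranks_per_channel(4, 1))  # 4x8GB single-rank DDR4  -> 2 ranks/channel (effectively dual rank)
    print(ranks_per_channel(2, 2))  # 2x32GB dual-rank DDR5   -> 2 ranks/channel without the 4-DIMM downsides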
  7. Agree
    jones177 reacted to Somerandomtechyboi in How accurate is this GPU OC tutorial?   
    Bullshit.
     
    Even if you max the core voltage, that will not degrade a GPU, because if you can even adjust core voltage via MSI Afterburner in the first place (most newer GPUs don't allow it, as far as I know), it'll likely be capped well below the voltage where degradation happens. Killing the GPU without a voltmod is straight up impossible; if you can't even degrade the thing with voltage in the first place, how are you going to kill it with voltage?
     
    Though if you can adjust core voltage, it'd usually be for undervolting, since on higher end GPUs core overclocking is dead and barely gives any performance benefit, so why bother? Just undervolt and save 50W+. And no, you can't degrade or kill a GPU with an undervolt, if that isn't obvious enough already; the worst that'll happen is instability from too low a voltage or too high a clock.
     
     
    What the hell is a fan power limit? You mean GPU power limit? That part is accurate. Even if you undervolt, I think there's still a benefit in the form of better 0.1%/1% lows when the GPU occasionally power spikes for a few milliseconds.
     
    This is more of a personal preference thing, but when I GPU OC I step memory clock in +100 increments and GPU core in +20 to +50 increments until a crash, then switch to smaller increments. When you've found pretty much the absolute limit, back off by 30-50MHz so there's a bit of headroom to mitigate any random artifacting or crashing.
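     
    That stepping approach looks roughly like this as a loop (just a sketch; is_stable and apply_core_offset are stand-ins for manually running your stress test and manually setting the offset in MSI Afterburner, not a real API, and the step sizes are the rough ranges mentioned above):
     
    # Sketch of the "big steps until crash, then small steps, then back off" core OC search.
    # is_stable() and apply_core_offset() are placeholders for manual testing in Afterburner.
    def find_core_offset(is_stable, apply_core_offset, coarse=50, fine=20, backoff=40):
        offset = 0
        while True:                      # coarse steps until the first crash/artifact
            apply_core_offset(offset + coarse)
            if not is_stable():
                break
            offset += coarse
        while True:                      # finer steps from the last good offset
            apply_core_offset(offset + fine)
            if not is_stable():
                break
            offset += fine
        return offset - backoff          # back off so random artifacting is less likely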
  8. Like
    jones177 reacted to emosun in Two GPUs without SLI... useful???   
    There, fixed your comment. Not sure why people are still talking in absolutes, but the Linus parrot theory is pretty sound. I get the feeling the only people who speak this way are either just repeating what they've heard or literally don't own any SLI hardware from the past 20 years to actually try out.
  9. Like
    jones177 reacted to Mick Naughty in Two GPUs without SLI... useful???   
    Weird how the OP clearly said without SLI and the Linus parrots come flying in saying how it's dead. 
    We get it, you've never used SLI before, and even if you did, you didn't know what you were doing. 
  10. Informative
    jones177 got a reaction from Hi P in Is 4K gaming actually worth it?   
    I started doing 4k in 2015, at first for work, but as soon as I saw my modded games at 4k I could not stop using it.
    The benefit in games is seeing more detail, if there is detail to be had. 
     
    I find the detail in sims, building games and modded games.  
    I don't find it in most vanilla games.
     
    It has been worth it to me but it has been expensive.
    My first 4k gaming rig had 2x GTX 980 tis in SLI, and that was replaced by 2x GTX 1080 tis in SLI. Then it was a single RTX 2080 ti with a 24/7 overclock. 
    Now it is 3090 tis that replaced 3080 tis, and with them, for the first time since I started doing 4k, I am satisfied.
     
  11. Funny
    jones177 reacted to tsmspace in controller vs joystick   
    Yeah, a bit of a joke I've been having. 
  12. Agree
    jones177 got a reaction from da na in Recent gpu driver update for Nvidia(516.94)   
    No.
    Installed on 8/10 on my 5900x computer.
     
    You may be having a "heat over time" issue. 
    It may not be your CPU or GPU but the VRMs on the motherboard, so run HWiNFO64 while gaming to see if anything shows up.
  13. Informative
    jones177 reacted to Oliveros in Adobe Premiere Problem with Graphics Card RTX 3090 PC freezes, screens off and restarts! VIDEO_TDR_FAILURE (116)   
    I ended up changing my GPU to an Asus TUF 3090 Ti. From day one till now, no issues; I will keep you updated if there are any!
  14. Agree
    jones177 reacted to Middcore in Is the Core 2 Duo E8400 the best of the Core Duo series?   
    Good by what standard? In 2022 it's trash. 
  15. Like
    jones177 got a reaction from Rokus in Feedback on new build with old case & i7 vs i9 dilemma   
    Since you are not really going high end with DDR5, I would say i7.
    Also, your choice of cooler is a bit weak for an i9.
     
    As for performance, in most games you will still be GPU bound with either CPU at 4k. That is the reason I still use an i9 11900k and a 5900x in my gaming rigs. At most I would get 3 more frames at 4k going with a 12th gen i9, and the same frames with the i7.
    It was more worthwhile, for about the same price, to go with a 3090 ti, since they give me 10 more frames at 4k even over a 3090.
     
     
  16. Informative
    jones177 reacted to Ebony Falcon in Vccio voltages 8700k   
    I got 4.7 at 1.17v with cache at 4.5.
    Max temp was 67c in AIDA64 and 10 loops of R23.
     
    The R23 score was 9100, down from 9700, so there's not a great deal in it, and temps are much, much better, so I'll probably stick with this in the summer. 
     
     
  17. Agree
    jones177 got a reaction from Fasauceome in PC Builds Bottleneck Calculator   
    It is silly.
    I ran my 5900x, i9 11900k and i9 10900k with an RTX 3090 at 4k and it said the 3090 is too weak for the CPUs.
     
    In reality it is the other way around. I need more CPU to get more frames out of a 3090 at 4k. 
    I use games to get an idea of when a CPU is no longer capable of getting the most frames out of a GPU, since their percentages mean something.
     
    I use the game Shadow of the Tomb Raider to compare CPUs with an equal number of cores.
    Looks like this.
     
    5800X with SOTTR at the highest preset.
    GPU Bound        1080p    1440p      4k
    RTX 2080 ti        48%      98%    100%
    RTX 3080 ti        38%      84%    100%
    With an i9 9900k
    RTX 2080 ti        36%      86%    100%
    RTX 3080 ti         2%      48%    100%
     
    Then when I add frames it becomes clearer.
    RTX 2080 ti      1080p      1440p        4k
    i9 9900k          168fps      123fps      67fps
    5800x              160fps      122fps      73fps
    RTX 3080 ti 
    i9 9900k          169fps      154fps      96fps
    5800x              198fps      163fps      96fps
     
    So with this game, an upgrade from a 2080 ti to a 3080 ti with an i9 9900k at 1080p is pointless. At 1440p it is livable but not ideal, and at 4k it is still doing fine.
     
    SOTTR is at the extreme when it comes to CPU bottlenecks, so it is the worst case. There are games like Horizon Zero Dawn that are GPU bound even at 1080p, so that is the best case.
     
  18. Like
    jones177 reacted to venomtail in Gaming screenshots   
    Trying to step up my screenshot game
    Portraits
     




  19. Informative
    jones177 got a reaction from Klamboshki in Budget Blender Build Suggestions   
    Yes, but it is limiting with larger 3D projects. 
    I went to 32GB mainly for Photoshop. 
    The free version of DaVinci Resolve likes more RAM as well.
    Yes, but you may want more cores later on.
    It would not make a difference if you were GPU rendering.
    How much 3D are you not going to do in that time?
    There is always a risk.
    I paid $2,300 each for my RTX 3090s and they cost half that now. Sometimes you have to pay to play.
    No.
    It depends on your rendering size. Mine was 8k plus, so lots of big files to move around. 
     
    Also, your system is only as fast as its slowest drive, so if you have HDDs in the system it will not matter.
    No.
    No.
    It is not a good idea to sell at all when doing 3D, unless the computer/parts are no longer capable of doing 3D.
    I used dedicated rendering machines and did pre/post production or gaming on the computer that was not rendering.
    I started doing 3D as a hobby but it turned out to be the way I made a living, so my view on these sorts of things may be a bit different. 
    You may have to do a BIOS update. I did with my X570.
     
  20. Agree
    jones177 reacted to GuiltySpark_ in RTX 3080 idles at 100+w   
    Yeah, I mean, he's only told us the power draw so far. If something is actually using the GPU it might be easy to spot in Task Manager. 
  21. Funny
    jones177 reacted to 8tg in What do you think about 4K?   
  22. Agree
    jones177 got a reaction from xg32 in Is my power supply ok for 3080TI?   
    I think you will need to find out if you will have a problem or not because some people do and others don't.
    I have had problems in the past so I like to use more PSU than I need.
     
    The worst that can happen is that the GPU will not turn on with the computer, and the problem with that is you won't know if it is the card or the PSU.
    When that happened to me I had another computer with a 1000 watt PSU to test with, so it was not an issue.
     
  23. Agree
    jones177 got a reaction from CommanderAlex in 10700k cooling question.   
    For my 8 core Intel CPUs I use Noctua NH-D15s. Even my i9 11900k is on one. 
     
    Usually for more cores, or if a D15 does not fit in a case, I use 360mm AIOs (EK).
     
    I have a 3080 ti and a 3090 ti in with the D15s, but they still game in the 50s and 60s (°C).
  24. Informative
    jones177 got a reaction from Bobbysixjp in Performance of a lower power 3080Ti vs higher power model   
    Yes and no.
    The FTW3 is smoother overclocked.
    The XC3 can be rough when overclocked; it stuttered in some benches.
     
    The FTW3 is hotter and harder to manage.
    It is too hot for the Cooler Master H500 ARGB that the XC3 is in, so it is in its box. It did fine in the Lian Li O11 Dynamic.
     
    If I were buying another 3080 ti it would be the Strix. It is bigger than the EVGAs and much cooler.
    I had the 3080 version. Even using 450 watts it was cooler than the FTW3 at stock.
     
  25. Informative
    jones177 got a reaction from Bobbysixjp in Performance of a lower power 3080Ti vs higher power model   
    Hard to say, since I don't know the brands, but with EVGA it is definitely worth it.
     
    I have an EVGA XC3 Ultra 3080 ti (350 watts stock) and an FTW3 Ultra 3080 ti (400 watts stock).
    Both have been in my i9 9900k computer.
     
    Time Spy.
    XC3 Ultra    = 17180, Graphics 18871, CPU 11397
    FTW3 Ultra = 18144, Graphics 20425, CPU 11112
     
    Shadow of the Tomb Raider
    SOTTR highest preset                                   1080p      1440p        4k.
    EVGA XC3 Ultra 3080 ti(350/366 watts)        168fps      143fps      83fps     
    EVGA FTW3 Ultra(400/450 watts)                 169fps      154fps      96fps
     
    Horizon Zero Dawn
    HZD at Ultra                                                  1080p      1440p        4k.
    EVGA XC3 Ultra 3080 ti(350/366 watts)        147fps      128fps      79fps     
    EVGA FTW3 Ultra(400/450 watts)                 155fps      138fps      88fps
     
    With the 20 series, the difference between my XC 2080 ti and my FTW3 Ultra 2080 ti was the cooling. 
    With the 30 series, at least with EVGA, I think they are also binned. I only have one sample, so I don't know for sure.
     