jones177

Member

Posts posted by jones177

  1. 4 hours ago, trevykb said:

    On a separate note, I'm upgrading my PSU from a Corsair RM650 (2021) to a Corsair RM850x. Can I use the same cables, or should I swap them all for the ones that come with the PSU?

    There are experts in the Power Supplies part of the forum, so someone there may be able to answer the question.

     

    I use the cables that come with the PSU. 

    I have 3 Corsair PSUs: 2 have the same pinout and one is different. My EVGA PSUs also have 2 different pinouts.

    I won't even install one unless the old one is in its box and put away.

     

  2. 1 minute ago, Needfuldoer said:

    What could have happened was a spambot bumped that thread, then you replied to it, and the spambot post was deleted.

     

    You're 100% sure you were on page 1?

    Yes.

    It has been months since I have even clicked on page 2.

     

    I am also usually on the lookout for resurrected posts, so not checking this one first was a brain fart on my part.

    10 minutes ago, Levent said:

    That thread was necro'd.

    I had to look that one up.😄

    4 minutes ago, LogicalDrm said:

    The post was necroed. That happens. That's why there's a post date below all thread titles. We remove necro posts if they don't contribute to the discussion or are made to really old posts.

     

    This is not a bug, it's just how internet forums work. Or rather, how the internet works.

    I need to be more observant in the future.

    6 minutes ago, rcmaehl said:

    !RemindMe 2 Years Necro this post. /s

     

    Unfortunately, Necroing just happens. I'm glad people are using the search function though!

    I did not even know it was there until you mentioned it. 

    If I google anything to do with computers, LTT posts do come up, and they usually have better answers than Tom's Hardware or Reddit.

  3. On 9/25/2020 at 7:13 PM, Camelsmaycry said:

    Hey

     

    Just wondering how I can bypass the power limit to get to 2100MHz on my Zotac 2080 ti AMP

     

    Thanks

     

    That may be more of a silicon thing.

     

    I had 2 EVGA FTW3 Ultra 2080 tis. One could hit 2,145MHz and the other would crash above 2,085MHz.

    Both used the same power (373 watts).

     

    To maintain over 2,100MHz the GPU had to stay below 55°C, and that was hard to do for any length of time air cooled and in a case.

     

    Both cards could run at 2,040MHz with the stock fan curve as a 24/7 overclock, so I was happy with that.
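
    If you want to see where your own card's silicon tops out, logging clocks against temperature and power while you run a benchmark makes it obvious. Below is a minimal Python sketch of that kind of monitoring, assuming the nvidia-ml-py package (my choice for illustration; GPU-Z or HWiNFO will show you the same data):

    ```python
    # Minimal monitoring sketch (not a power-limit bypass): logs core clock,
    # temperature and power draw once a second so you can watch where the
    # boost clock drops off. Assumes: pip install nvidia-ml-py
    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

    try:
        for _ in range(60):  # one sample per second for a minute
            clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)  # MHz
            temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)  # °C
            watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000  # reported in milliwatts
            print(f"core {clock} MHz  temp {temp} C  power {watts:.0f} W")
            time.sleep(1)
    finally:
        pynvml.nvmlShutdown()
    ```

    Run it alongside a Heaven loop and you can see the boost bins step down as the core heats up, which is exactly why staying under 55°C mattered for holding 2,100MHz+.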


  4. My i9 9900k is using a budget 3080 ti now, but my i9 10900kf is using a 3090.

     

    This is what it does at stock in Heaven.

    FPS: 76.7
    Score: 1933
    Min FPS: 38.7
    Max FPS: 147.1

     

    Your score is about right for an i9 9900k.

     

    Most scores I see online with Heaven are at 1080p with the same settings you used.

    You may want to add that score so more people can relate to it. 

     

  5. I have the EVGA FTW3 Ultra version. They are $1,200 on the EVGA website right now.

     

    Doing what we are doing now, or watching YouTube, they use 25 watts, the same as an XC3 Ultra 3080 ti.

     

    They are 10°C cooler on the core and 20°C cooler on the VRAM under load than the 3080 ti version.

     

    At 4k they produce 10 more frames than the 3080 ti version with my R9 and i9.

     

    I have 3090s, and the 3090 tis are cooler and faster, but not enough faster to be an upgrade.

     

    I am also not a fan of AIO cards. 

    Out of the 6 360mm AIOs I have bought in the past 3 years, 2 have failed.

    I don't mind spending $130 to replace an AIO, but I would not like to RMA a $1,000+ GPU over one.

     

    I have one 3090 ti on a 1000 watt PSU and it is doing fine. 

     

  6. 7 minutes ago, 99Carnies said:

    So you're thinking I'm maxed out at 50% CPU usage, aka 4 out of my 8 CPU cores, and it's a CPU bottleneck at 1080p settings? It makes sense, especially since my temps are low.

    Yes.

     

    I also ran a Strix 3080 with the i9 9900k, and at 1440p and 4k it was in between the 2080 ti and 3080 ti, but the same at 1080p.

     

    SOTTR      1080p      1440p        4k

    2080 ti      168fps      123fps      67fps

    3080         167fps      144fps      87fps

    3080 ti      169fps      154fps      96fps

     

    There are games that are not CPU bound at 1080p, so the i9 does well in them.

    One is Horizon Zero Dawn.

    HZD FTW 3080 ti    1080p     1440p       4k

    i9 9900k                   155fps    138fps    88fps
    5800x                       156fps    138fps    89fps 
    5900x                       156fps    138fps    88fps   
    i9 11900k                 153fps    137fps    89fps
    i9 10900k                 157fps    138fps    87fps

     

    Games like that are less common; most fall somewhere between the two.
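
    To make that bottleneck logic concrete, here is a quick Python sketch over the SOTTR numbers from the table above (the helper function is just mine for illustration): if stepping up the GPU barely moves the fps at a resolution, the CPU is the limit there.

    ```python
    # SOTTR on an i9 9900k: (1080p, 1440p, 4k) fps, from the table above.
    sottr_i9_9900k = {
        "2080 ti": (168, 123, 67),
        "3080":    (167, 144, 87),
        "3080 ti": (169, 154, 96),
    }

    def gpu_gain(res_index: int, slow: str = "2080 ti", fast: str = "3080 ti") -> float:
        """Percent fps gained by swapping the slow card for the fast one."""
        a = sottr_i9_9900k[slow][res_index]
        b = sottr_i9_9900k[fast][res_index]
        return (b - a) / a * 100

    for i, res in enumerate(("1080p", "1440p", "4k")):
        print(f"{res}: +{gpu_gain(i):.0f}% from 2080 ti to 3080 ti")
    # Prints roughly +1% at 1080p (CPU bound) but +25% at 1440p and +43%
    # at 4k (GPU bound), which is the bottleneck showing up in the numbers.
    ```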

    7 minutes ago, 99Carnies said:

    In this case should I switch to AMD and upgrade my motherboard to fit a new AMD chip meant for 1080p/1440p? I don't really care for 4k at the moment so I'm not stressed about getting an Intel CPU for better performance at higher quality.

    The newer chips from AMD and Intel have higher IPC, so they give higher frames at 1080p in CPU bound games.

     

    If you need the frames now and don't mind using Windows 11, get an i9 12900k. You can also wait for next gen, since you are not exactly suffering with the i9.

     

    I am waiting for a price drop since the i9 12900k is a little too expensive right now.

     

  7. That is what they do at 1080p.

     

    I have an i9 9900k that ran an FTW3 Ultra 2080 ti and an FTW3 Ultra 3080 ti.

    This is what it looks like with Shadow of the Tomb Raider.

     

    i9 9900k 

    SOTTR      1080p      1440p        4k

    2080 ti      168fps      123fps      67fps

    3080 ti      169fps      154fps      96fps

     

    A 5800x had different results.

     SOTTR      1080p      1440p        4k

    2080 ti        160fps      122fps      73fps

    3080 ti        198fps      163fps      96fps

     

  8. 2 hours ago, Dedayog said:

    Completely agree. Their products seem to be top notch; it was just this limited experience with customer service and PSUs.

    I can recommend the PSUs, but I'm not a fan of their CSR.

    I think we need a "Why we don't buy X product from X company" post.  

    It would be really interesting.

     

  9. 3 minutes ago, Dedayog said:

    They're still $170, but I'd stay away from EVGA customer service if you have an issue.  

     

    I'd look at a higher tier PSU for a 3080 IMO.

    The 1000 watt version is $130. 

     

    I am using 5 EVGA PSUs with no issues. Two are running 3090 tis and two have 3090s. 

    Two replaced Corsair 850s that I was having issues with.

     

    I think it is luck of the draw whether you get a good one or not.

    8 minutes ago, Bad5ector said:

    Really? I've only heard good things about their customer service. But I've personally never had to deal with them, so I wouldn't know first-hand.

    I have done 2 GPU RMAs with them but never a PSU.  They may deal with them differently.

  10. 3 minutes ago, Tena said:

    So just like @_Omega_, would you recommend getting a higher wattage PSU? I don't mind spending more for it. Also, with all the comments about the card running hot, I will probably get more fans as well to keep it cool.

    I usually buy 1000 watt PSUs, but I get them on sale for about the price of an 850 watt PSU.

    My last one was a 1300 watt G+ from EVGA for $170.

  11. You will not get the frames you would if you had a 5800x and a B550, but the card will run on your system with an 850 watt PSU.

     

    The issue I found with 3080 tis is keeping them cool. They are silly hot compared to GTX 1080 tis and 2080 tis.

    All my cases either had to be modified or replaced to keep them as cool as those older GPUs.

     

    Good luck with your new GPU.


  12. 2 hours ago, Combat Stackz said:

    I'm a noob to all of this, so I'm a bit confused as to which is better. I see that your FPS with the EVGA XC3 Ultra 3080 ti performed the worst at 4k, and with mine being just the 3080 12GB, I'm assuming it would be even worse. Maybe I'm incorrect here. I'm not sure if I should cancel my order and get a different brand like ASUS.

    A Strix would be faster but it may not be fast enough to make a difference. 

     

    The 3 cards I have in the post played the same in games. 

    The FTW3 Ultra is more fun for overclocking but the extra 50 watts was hard to manage. 

    If I put the FTW3 in the case that the XC3 is in, the temps under load would go from 73°C to 80°C. I would have to either undervolt or recase.

     

    That is why I kept the XC3 Ultra. It is a better daily driver.


  13. 6 hours ago, Lairlair said:

    Hi folks,

     

    Lately I've been thinking a lot about how 4K isn't that great.

    For a 55" 4K TV you'd need to be 1 meter (aka 3.2 feet) away from it to start seeing the difference between individual pixels, and from around 5 meters away (16.4 feet) you wouldn't physically be able to notice the difference between full HD and 4K. I know a more realistic scenario is to sit somewhere in between, so that the TV takes up about 30° of your viewing angle (as seen in this LTT video), and that would be about 2m/6.5 feet away. But still, at that distance, if you swap from full HD to 4K resolution on a film/video game... how many would notice the difference? I'm aware that I'm talking to a tech-savvy crowd, but I'm sure most people around me wouldn't notice. I even worked in a cinema for a few years, and they only used 2K projectors. Once I went to another cinema that used 4K and I couldn't for the life of me see any benefits in the image quality.

     

    So this is my tepid take. I'll admit that 4K does have some small benefits for the average Joe, and can be useful for enthusiasts or people working in video / image creation. BUT the flip side is that it's a lot more expensive, and I'm not just talking about money. It uses more energy (for the TVs, but also for the servers to stream 4x more pixels and the graphics cards to process all that) and requires upgrading a whole production line for this to even start making sense (so we're talking mineral extraction, refining and assembly for all the new cameras, monitors, computers and other pieces of equipment). And for what? So that you can tell Chris Hemsworth's beard hairs apart? Is it really worth it?

     

    What do you think? Do you use 4K and like it? Would you recommend it and why? Are you also caught between being excited for new tech and hating how dirty its production is?

    I am a meter away from my OLED and I don't see pixels. 

    Here is my setup.

    [Photo: my desk setup]

    It is worth it to me.

    I started in 2015, and it was mainly for my modded games that used 2k and 4k textures.

    It was also great for space and plane sims since distant objects have more detail.

    Since I can easily do 1000s of hours in these types of games, 4k is worth it.

     

    Most games don't have the detail in the textures to benefit from it or they have poor quality LODs that make distant objects look even worse.

     

    Before I started doing 4k I usually bought xx70 cards like the GTX 470, 570, 670 and 970, but after 4k it was a 980, 2x 980 tis in SLI and even 2x 1080 tis in SLI.

    Now it is 3090s and 3090 tis. 

     

    Now it is only expensive if you want to be close to 120Hz as well.
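
    As an aside, the 1 meter figure in the quote above checks out against the usual 1 arcminute (20/20 vision) rule of thumb. Here is a quick Python sketch of that calculation; the acuity assumption is the standard one, not something from this thread:

    ```python
    # Distance at which one pixel of a 55" 4K panel subtends 1 arcminute,
    # the usual resolution limit quoted for 20/20 vision.
    import math

    diag_in, h_pixels, aspect = 55, 3840, (16, 9)

    # Screen width from the diagonal and the 16:9 aspect ratio.
    width_m = diag_in * 0.0254 * aspect[0] / math.hypot(*aspect)
    pixel_pitch_m = width_m / h_pixels

    one_arcmin = math.radians(1 / 60)
    distance_m = pixel_pitch_m / math.tan(one_arcmin)

    print(f"pixel pitch: {pixel_pitch_m * 1000:.2f} mm")  # ~0.32 mm
    print(f"acuity limit: {distance_m:.2f} m")            # ~1.1 m
    ```

    Past roughly 1.1 m the pixels blend together, so sitting a meter from the panel is right at the threshold.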

  14. My FTW3 Ultra 2080 tis had 24/7 overclocks at around 2040 to 2070MHz. It was worth it with those cards for that gen.

    My 30 series cards are stock because it is not worth it. 

     

    What I got was about 7 extra frames at 4k, and that put most games over 60fps, like Assassin's Creed Odyssey.

    With VRR as good as it is now it is probably not a big deal, but in 2018 it was.

  15. 20 minutes ago, Combat Stackz said:

    I see people on YouTube running their games at ultra settings. At least that's what the titles say. I'm a tech noob so I don't have the slightest clue.

     

    I do know that it looks cool and I want whatever allows me to run games on ultra high settings. (Except for the 3090)

    A high end 3080 ti like a Strix OC, FTW3 Ultra, Suprim or the equivalent will do about the same frames as a 3090.

     

    GPU models are not the same, so an entry level 3080 ti like my EVGA XC3 Ultra can game like a 3080 12GB, and the top tier cards like my FTW3 Ultra are about 10 frames faster.

     

    This is what it looks like with an i9 9900k.

     

    Shadow of the Tomb Raider

    SOTTR highest preset                                   1080p      1440p        4k

    ASUS Strix 3080(370/450 watts)                   167fps      144fps      87fps

    EVGA XC3 Ultra 3080 ti(350/366 watts)        168fps      143fps      83fps     

    EVGA FTW3 Ultra(400/450 watts)                 169fps      154fps      96fps

     

    Horizon Zero Dawn

    HZD at Ultra                                                  1080p      1440p         4k

    ASUS Strix 3080(370/450 watts)                   134fps      118fps       80fps

    EVGA XC3 Ultra 3080 ti(350/366 watts)        147fps      128fps      79fps     

    EVGA FTW3 Ultra(400/450 watts)                 155fps      138fps      88fps

     

    These are at ultra, but with the settings I like it is easy to get over 100fps with the FTW3 Ultra 3080 ti.
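
    One more thing the tables show if you divide it out: the pricier tiers buy frames, not efficiency. A rough Python sketch using the 4k SOTTR numbers and the first (stock) board-power figure from each card's parentheses above; actual draw varies by game, so treat it as illustrative:

    ```python
    # Rough 4k frames-per-watt in SOTTR, using stock board power ratings.
    cards_4k = {
        "Strix 3080":         (87, 370),  # (4k fps, stock watts)
        "XC3 Ultra 3080 ti":  (83, 350),
        "FTW3 Ultra 3080 ti": (96, 400),
    }

    for card, (fps, watts) in cards_4k.items():
        print(f"{card}: {fps / watts:.3f} fps per watt")
    # All three land around 0.23-0.24 fps per watt.
    ```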

     

  16. 58 minutes ago, x Princess Leliana x said:

    Idle on desktop is 50°C-ish with spikes. Idle in this game is 60-75°C. I have a triple rad AIO, ambient temperature is around 26-30°C, I'm in a hot room during summer, and honestly I do not know if it's intake or exhaust. It's on the top of my PC, with the fans on the inside of the PC; the fans have the "open" section (without the cross bracing behind) facing the computer.

    It is probably the hot room.

    My 5900x has low temps at idle and in benches with an EK 360mm AIO, but it is in a room that is at a constant 21°C all year round.

    If I put it in a bedroom under a desk it will run a lot hotter. 

     

    Also, some cores run hotter than others.

    This is my 5900x after an R20 run. As you can see, cores 09 and 10 are hot but the rest are not. Under load they are about the same.

    These hot cores bring up the average temperature.

    [Screenshot: 5900x per-core temperatures after a Cinebench R20 run]
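
    If you want to check the spread on your own chip without a screenshot, something like this Python sketch works on Linux with the psutil package. It is a best-effort assumption: sensor names vary by platform, and AMD's k10temp often only exposes Tctl/Tdie rather than true per-core readings.

    ```python
    # Compare the hottest core reading to the average, as in the R20
    # screenshot above. Assumes Linux and: pip install psutil
    import psutil

    sensors = psutil.sensors_temperatures()
    readings = sensors.get("coretemp") or sensors.get("k10temp") or []

    temps = [r.current for r in readings]
    if temps:
        print(f"average: {sum(temps) / len(temps):.1f} C")
        print(f"hottest: {max(temps):.1f} C")
        print(f"spread:  {max(temps) - min(temps):.1f} C")
    else:
        print("no CPU temperature sensor found")
    ```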
