jones177

Posts posted by jones177

  1. On 7/22/2022 at 6:39 PM, Wompusdude said:

    I was thinking about that, but the issue is intermittent. So I could very possibly test it on one of their computers and not have the issue arise at that time, even if it is due to a bad card.

     

    Also, do you recommend any cables?

    Then I would return it and roll the dice again with another card.

    It is a buyer's market.

     

  2. 4 hours ago, Wompusdude said:

    although it’s the same DP cable I’ve used for years with my 980 Ti with zero issues. 

    One of my FTW3 Ultra 3090 tis did not work properly with a cable that was fine with an RTX 3080 ti. The 3080 ti had the same issue with a cable that was fine with a 2080 ti.

     

    I would test the card in another system or at Micro Center to see if it is your system or the card.

  3. 10 minutes ago, farmfowls said:

    Thanks for these numbers! Great to actually have a real world comparison. If you had to choose between the two, taking value and longevity into consideration, which would you go for? And do you find there's a justifiable difference between the 3080 and the 3080 ti? I know it's a big price jump. But I also don't plan on upgrading any time soon, as this will be my card for the next few generations. Hopefully I won't be too limited with an i7-8086k as well.

    I was not unhappy with the power of the Strix 3080. I sold it for its lack of vram. If it were a 12gb version I would still have it.

     

    The 3080 ti is not a big jump from a 3080. Only the 3090 ti feels like an upgrade to me.

     

    I have an i7 8086k. It uses an MSI Gaming X Trio 3080 ti. Compared to the i9 9900k it does fine as long as the game does not use the extra cores.

    Shadow of the Tomb Raider does use extra cores, so this is how it looks:

    SOTTR with a 3080 ti     1080p     1440p     4K
    i7 8086k (6 core)        156fps    147fps    95fps
    i9 9900k (8 core)        169fps    154fps    96fps
    i9 10900k (10 core)      189fps    164fps    97fps
    R9 5900x (12 core)       211fps    175fps    104fps

     

    At 4k it does fine against the more modern CPUs.

     

  4. 1 hour ago, farmfowls said:

    Okay so long story short, my GPU died and I need a new one. Stock wise and between the choices I have, the only thing I can get where I live is an Asus TUF OC 3080 12GB ($1000) and an EVGA FTW3 Ultra 3080 ti ($1400). Could have gotten the Rog Strix 3080 12GB but it just went out of stock, and there are MSI and Gigabyte options but I am not familiar with those brands, plus I've heard a lot of drama over the past few years with them. I know we're talking about a 3080 and a 3080 ti (plus a huge price difference) and I'm leaning more toward the 3080 12GB side, but recently read that the Tuf (I've only had Strix cards myself) is a two power connector card and they've had power limit issues. Should I be worried about this? Is it issue enough to justify spending $400 more on a different card? I want to be keeping it for the foreseeable future and don't really plan to sell my card and upgrade. Just want something I can use for the next few years. I'm coming from an Asus Strix OC 1080 Ti.

    I have an FTW3 Ultra 3080 ti and tested it against a 10gb Strix 3080.

    The Tuf should be close to the 10gb Strix.

     

    This is how it did with an i9 9900k:

     

    Shadow of the Tomb Raider            1080p     1440p     4K
    ASUS Strix 3080 (370/450 watts)      167fps    144fps    87fps
    EVGA FTW3 Ultra (400/450 watts)      169fps    154fps    96fps

    Horizon Zero Dawn
    ASUS Strix 3080 (370/450 watts)      134fps    118fps    80fps
    EVGA FTW3 Ultra (400/450 watts)      155fps    138fps    88fps

    Assassin's Creed Odyssey
    ASUS Strix 3080 (370/450 watts)      107fps    82fps     64fps
    EVGA FTW3 Ultra (400/450 watts)      106fps    89fps     72fps

     

    Two of the games are CPU limited at 1080p.

     

  5. I did SLI with GTX 1080 tis that were about the same power as the 2080.

    Over time it got to be a pain to use, and for its last year of life it was productivity only.

    The setup was replaced with a single 2080 ti.

     

    22 minutes ago, Naijin said:

    Don't do SLI; 2080 SLI is in almost all gaming cases slower than an RTX 2080 Ti, so basically anything is a better investment than that.

    Unless you are doing some form of editing/... where you need those cores and SLI actually scales well.

     

    EDIT: here's some benchmarks I found 

     

    That is one of the reasons it became a pain to use: remembering when to switch it off.

     

  6. 1 hour ago, Motifator said:

    I realize that, but the melting point of the average mid-high to high-end PSU cable is indeed around 400-450W in case an internal short happens / something goes south.

    You're ultimately causing an external issue by using adapters like in the OP's case, that's not the problem of the PSU itself but the method. Also, it really doesn't matter that the cable carries two plugs on it... unless you have a pot on the card and are pulling an enormous amount of power.

    It's probably another factor such as the PSU itself being crap, or something else causing it. These cables can withstand intense overclocking sessions to no end realistically. The cables are almost never the reason, but something else is. You can have the PSU blown and cables burnt, then blame it on the cables when it was something entirely else.

    The PSU was a Corsair 850 I bought in 2015. It was my first modular PSU.

    The point of failure was at the connector, on the side facing the GPU.  I only noticed it after unplugging the GPU.

    I replaced all the cables with a sleeved kit. 

     

    I do not have a lot of faith in the cables that come with PSUs.

    Since 2019 I have had 2 cable failures out of 7 builds.

    One was the 24 pin on an EVGA 1300 G2. The other was a PCIe cable on a 1000 watt EVGA G+.

    It took over 6 months for the issues to come up and I consider the cause to be poor quality control.

     

    I also have setups that use more than 150 watts on an 8 pin but there is nothing budget about them.

    One of my 3090 tis has a 12 pin to 2x 8 pin (2 pins blank) cable with instructions to use it with a 1000 watt PSU.

    It is beautifully made and I can trust it to do the job.

     

  7. 5 minutes ago, Motifator said:


    Those ratings are horse shit. The cable is well capable of pulling way more without even coming close to burning. They're very conservatively rated. Old dual GPU cards were an example of this.

    For instance my 1500i has the OCP at 40a on each cable aka 420W, you'd think Jonnyguru would know what he's doing, right? Otherwise he'd limit them to 150W.

    Cables are not all the same. Some are junk.

    I had a GTX 980 ti SLI rig. Each 980 ti drew 250 watts. I used one cable (8 pin plus 6 pin) for each 980 ti and lost one card and the motherboard.

    Since then I go by those "horse shit" ratings.

     

     

     

  8. It depends on what 3080 you have.

    An 8 pin connector is rated at 150 watts and the PCIe slot at 75 watts, so a card with two 8 pins has a 375 watt budget.

     

    Some cards (Tuf) draw 340 watts, others (Strix) draw 370 watts and some (FTW3) 400 watts.

     

    I lost a card and a motherboard with a PCIe cable so I play it safe.
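
    To make that arithmetic concrete, here is a minimal sketch. The per-card 8 pin counts are my assumption for illustration (the Tuf is the two connector card mentioned above), not verified specs:

    ```python
    # Rough sketch of the PCIe power-budget arithmetic above.
    # Ratings: 150 W per 8-pin connector, 75 W from the PCIe slot.
    PIN8_W = 150   # rated watts per 8-pin PCIe connector
    SLOT_W = 75    # rated watts from the PCIe slot itself

    def budget(n_8pin: int) -> int:
        """Total rated power for a card with n_8pin 8-pin connectors."""
        return n_8pin * PIN8_W + SLOT_W

    # (assumed connector count, typical draw in watts from the post)
    cards = {
        "Tuf (2x 8-pin)":   (2, 340),
        "Strix (3x 8-pin)": (3, 370),
        "FTW3 (3x 8-pin)":  (3, 400),
    }

    for name, (n, draw) in cards.items():
        b = budget(n)
        verdict = "within rating" if draw <= b else "OVER rating"
        print(f"{name}: {draw} W of a {b} W budget -> {verdict}")
    ```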

     

     

    I would go with a PCIe 4 SSD. In some games it does make a big difference and it is well worth the extra money.

     

    The case is not a good design for a 3080 ti. I would go with a case that has intake below the GPU. I like the Lian Li O11 Dynamic.

     

    For cooling a 5800x I would use a Noctua NH-D15S or a 360mm AIO. Mine was on a 360mm AIO and it idled in the low 30s and gamed in the 50s and 60s.

  10. 1 hour ago, JimBoobJovi said:

    Hi, I ran it in debug mode and ran the same benchmark. FPS still fluctuates a lot and the score at the end is low, almost like it's being limited by 15-20%. In games I'm still getting micro stutters. Was thinking of bringing the card to a friend's house to see if the issue is the same when I put the card in his PC.

    See if the card is power starved.

    I use GPU-Z for this.

    Here is the first stock Time Spy run I did with a 3090 ti to see if it is running cool and using the proper amount of power. Mine is a 450 watt card stock and it used 455 watts.

    I am monitoring the CPU and motherboard VRMs as well. They too can cause stutter if they get too hot.

    [Screenshot: stock Time Spy run]
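
    If you want a log of it instead of watching the GPU-Z window, something along these lines works too. A minimal sketch, assuming an NVIDIA card, a single GPU and nvidia-smi on the PATH:

    ```python
    import subprocess
    import time

    # Poll power draw, power limit, temperature and core clock once per
    # second while a benchmark runs, and flag samples where the card sits
    # well below its limit under load (a possible sign of power starvation,
    # e.g. a bad cable or PSU rail).
    QUERY = [
        "nvidia-smi",
        "--query-gpu=power.draw,power.limit,temperature.gpu,clocks.gr",
        "--format=csv,noheader,nounits",
    ]

    for _ in range(60):  # one minute of samples; run the benchmark meanwhile
        out = subprocess.check_output(QUERY, text=True).strip()
        draw, limit, temp, clock = (float(v) for v in out.split(","))
        flag = "  <-- well under limit" if draw < 0.8 * limit else ""
        print(f"{draw:6.1f}W / {limit:6.1f}W  {temp:3.0f}C  {clock:4.0f}MHz{flag}")
        time.sleep(1)
    ```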

  11. 8 hours ago, Toolhead1987 said:

    I thought I struck gold when I walked into Best Buy last summer and they had a single MSI 3080ti Gaming X Trio in stock.
    $1750 plus taxes ($1950 IIRC). I sold my 2080ti while I was in Germany for 1000 euro ($1200) so it wasn't a complete loss...
    but looking at Best Buy now they have 3080tis for $929.
    Kill me.

    edit: sorry, it was a 2080ti I sold, got my GPUs mixed up**

    I bought the same card for the same price.

     

    Now the card is in my game server and was replaced with a 3090 Gaming X Trio. 

    It is a bit of a waste, but I sold all my 10 series cards that usually do those types of jobs.

    7 hours ago, Toolhead1987 said:

    yeah that's true.
    I have been happy with the card itself.
    I use it with an LG C1 48" and games really look amazing.
    The only problem is I almost exclusively play War Thunder lol.
    In all my years I just never saw GPU prices decline within a single generation like that.
    I think I sold my 1080ti for $300 the week I got my 2080ti and that already felt like a rip-off.

    I bought mine to use with an LG OLED as well. Needed that 4k 120hz goodness.

     

    My 3x 1080 tis sold for about $485 each at the end of last year. One of my 2080 tis went as well, for $1,100.

     

    It is a market so it is best to sell high and buy low. 

    I still have two 2080 tis that did not get replaced before the price drop so they will sit in their boxes for now.

  12. I did 3D for a living and for a Ryzen setup I would use the system I have listed below.

     

    CPU: 5900X

    Motherboard: X570 AORUS Master

    RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200

    GPU: EVGA FTW3 ULTRA  RTX 3080 ti

    PSU: EVGA 1000 G+

    Case: Lian Li O11 Dynamic

    Cooler: EK 360mm AIO

    SSD#1: Corsair MP600 1TB

    SSD#2: Crucial MX500 2.5" 2TB

     

    With long overnight CPU renders, even with a cool CPU like the 5900x, the motherboard may overheat in a hot climate.

    I like AORUS Masters for this but will use an Ultra as well.

    The 5900x does not render hot but does idle hot so I use a 360mm AIO.

     

    For a GPU rendering only rig, I would use Intel with enough cores to make Photoshop happy and that is about 8. 

    Intel chips are cooler when not rendering and use about half the watts at idle.

    For Intel I use AORUS Master motherboards as well. I have ASUS boards, but I will not use less than a Hero since I have had issues when I did. Usually the issues were CPUs running hot.

     

    GPUs can get hot rendering, so I now like cases with air intakes below the GPU, like the Lian Li O11 Dynamic. For a more traditional setup I like to use cases with 200mm fans.

     

    With GPU rendering, vram becomes an issue so the more the better. 

    I used 11gb for years and 12gb was not an upgrade. The 3090 with 24gb was a good upgrade, so you may want to wait for prices to come down in your area of the world.

     

    I did 3D design, so about 7 to 200 renderings per project. With that, storage was not an issue, so 1 or 2tb SSDs.

    Slow storage is an issue, so since 2007 I have not used any 7,200rpm HDDs. If I have to use them they are remote, like on a server or USB drive.

    A texture that takes 1 second to load from an SSD takes 5 seconds from a 7,200rpm HDD, and that can add up in a complex scene.

    Also, most consumer HDDs are designed to move small amounts of data fast but will choke moving large amounts. This can cause errors and crashes, usually in the middle of the night after multiple renders.
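
    To put a number on "adds up", a quick back-of-the-envelope sketch; the per-texture times are the ones above, and the texture count is a made-up example:

    ```python
    # Back-of-the-envelope texture load times: ~1 s per texture from an
    # SSD vs ~5 s from a 7,200rpm HDD (figures from the post). The
    # texture count is an assumed value for illustration only.
    SSD_S, HDD_S = 1, 5
    textures = 150  # hypothetical complex scene

    ssd_total = textures * SSD_S
    hdd_total = textures * HDD_S
    print(f"SSD: {ssd_total / 60:.1f} min, HDD: {hdd_total / 60:.1f} min "
          f"({(hdd_total - ssd_total) / 60:.1f} min extra per scene load)")
    ```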

     

    Heat over time is what caused most issues, so I never overclocked my rendering rigs, and today with hotter components I would probably be undervolting them.

      

  13. Just now, Hunlight said:

    yeah I love ultrawide both for movies and gaming. Just need to wait for DLSS support for it.

    I started using one in 2015. It was a 60hz 3440 x 1440 Samsung. I replaced it in 2019 with the LG.

     

    In 2020 my son wanted more desk space, so I switched out his OLED for my ultrawide.

    I have it back now but it is in my test bench area with the i9 9900k. 

     

    For movies I like the OLEDs, so to go back to ultrawides I need that technology. The market seems to be going that way, so it is only a matter of time.

  14. 33 minutes ago, Hunlight said:

    yeah I tried BF5 with the 700 and 130 setup. It was mostly constant 2070-2050MHz and around mid to low 60C. Not sure about fps, probably a marginal difference, but clocks are around +100 from the auto OC. Sadly I could not test ray tracing with DLSS. Apparently BF5 is still using DLSS 1.0, and since 1.0 does not support ultrawide it is greyed out for me and I cannot turn it on 😞

    Ultrawide can be a pain but I still prefer it over 4k in some games.  

     

    I am still waiting for a 5120×2160 120hz model to come out with an OLED type screen.

     

    My LG 38" 3840 x 1600 75hz monitor looks low rent compared to the OLEDs, so I sometimes use them for ultrawide.


     

  15. 1 hour ago, Numik said:

    I can't find anything on 12900k vs mods, but from what I've seen, when it comes to Win11 it makes no difference compared to Win10 in the games I play (or the very few I could find any info on).

    Thanks for the info.

     

    1 hour ago, Numik said:

    I was planning on getting a new PC next year, but the lags I get in some modded games are just driving me crazy.

    It is the reason I retired my i9 9900k from gaming even though it does fine with vanilla games.

    1 hour ago, Numik said:

    With the GPU I can wait for the RTX 40 series, since FPS is not an issue. But when I have to wait for several seconds for something to happen because the CPU is maxed out, I'd rather spend money right now.

    Some of my modded games use more than 12gb of vram, so 3090s are a must. They do 4k 120hz on my OLEDs now, so I have no need for 40 series cards.

    1 hour ago, Numik said:

    I'll go with the 12900k then. Should last me longer than a 12700 or 12600. Now to just get a bracket for my Kraken x70 for the LGA 1700 socket.

    For my modded games I have always used PassMark's single thread benchmark to pick CPUs.

    As you can see the 12900k is the winner. https://www.cpubenchmark.net/singleThread.html

    My i9 11900k is way down the list and does fine so you will have lots of overhead for future modding.

     

  16. 1 hour ago, Hunlight said:

    Good, thanks for the info, will give it a try.

    -----EDIT------

     

    Tried to set 140 and 800. Ran Superposition 4K Optimized and got a black screen, so I guess not stable. Lowered to 130 and 700 to be sure. It ran perfectly smooth and got a 15480 score, which is higher than it was. To be sure I ran Superposition in game mode; the MHz held on to over 2000 almost the whole time (started from 2070, dropped to 2010 when temps got to 72C), so the result is good. Will go test in games for "real life" performance.

    I found that I had to lower clocks for RT games like Control and CP2077 but regular games were fine. 

     

     

  17. 5 hours ago, Hunlight said:

    I compared it with other 3600 + 3080 builds and it is above average even with my sloppy GPU OC and no CPU OC, so it should be fine. I just wonder if going further on the GPU, like manually setting the GPU clock and doing something like +500-700 on memory instead of +200, is worth it?

    My 2080 ti used +800 on memory for about 2 years without issues, so that is where I started with the 30 series. None went past +1100 without artifacts.

    On the core all did fine at +100 but had issues at +150, so +145 was about it.

     

    If I needed the frames, +100 on the core and +800 on memory is what I would use, since that was stable on the 3080, 3080 ti and 3090, from MSI, EVGA and ASUS.
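
    Those numbers just came from stepping and testing. In code the procedure looks roughly like this; apply_offset and passes_stress_test are hypothetical placeholders, not a real API (real offset control goes through a vendor tool like Afterburner or Precision X1):

    ```python
    # Sketch of the step-until-artifacts approach described above.
    # apply_offset() and passes_stress_test() are hypothetical
    # placeholders supplied by the caller. Offsets are in MHz.
    def find_stable_offset(start, step, ceiling,
                           apply_offset, passes_stress_test):
        """Raise the offset from a known-good start until the stress
        test fails or the ceiling is reached, then settle on the last
        good value."""
        offset = start
        while offset + step <= ceiling:
            apply_offset(offset + step)
            if not passes_stress_test():
                break
            offset += step
        apply_offset(offset)  # fall back to the last stable offset
        return offset

    # e.g. memory: start at the +800 that was stable on the 2080 ti,
    # step by +50, and stop before the +1100 where artifacts showed up.
    ```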

     

    In a game like Battlefield V I would get 1995, but in other games like Shadow of the Tomb Raider it was around 1785 to 1800. Then there are games like Assassin's Creed Odyssey that hit 1950 and stay there with the % going up and down.

     

    I don't undervolt my 30 series cards for the same reason I don't overclock: I don't have a reason to.

     

    I did test it with the EVGA XC3 Ultra 3080 ti but it was not smooth. I think that card needs all the volts it can get.

    Your Suprim, being a premium card, would do a lot better.

    I don't have the Gigabyte for the reasons you stated, but I did get the EVGA (FTW3 Ultra) version and it is 10c cooler than my FTW3 Ultra 3080 ti, with the vram 20c cooler. It is on average 10 frames quicker than my 3090 at 4k.

     

    I was so happy with the first one I ordered a second. I paid $1,500 at the EVGA store.

     

     
