jones177

Member
  • Posts: 3,560

Everything posted by jones177

  1. Then I would return it and roll the dice again with another card. It is a buyer's market.
  2. One of my FTW3 Ultra 3090 tis did not work properly with a cable that was fine with an RTX 3080 ti. The 3080 ti had the same issue with a cable that was fine with a 2080 ti. I would test the card in another system or at Micro Center to see if it is your system or the card.
  3. I was not unhappy with the power of the Strix 3080. I sold it for a lack of vram. If it was a 12gb version I would still have it. The 3080 ti is not a big jump from a 3080. Only the 3090 ti feels like an upgrade to me. I have an i7 8086k. It is using an MSI Gaming X Trio 3080 ti. Compared to the i9 9900k it does fine as long as the game does not use the extra cores. Shadow of the Tomb Raider does use cores, so this is how it looks.
     SOTTR with a 3080 ti     1080p    1440p    4K
     i7 8086k (6 core)        156fps   147fps   95fps
     i9 9900k (8 core)        169fps   154fps   96fps
     i9 10900k (10 core)      189fps   164fps   97fps
     R9 5900x (12 core)       211fps   175fps   104fps
     At 4k it does fine against the more modern CPUs.
  4. I have a FTW3 Ultra 3080 ti and did test it against a Strix 10gb 3080. The Tuf should be close to the 10gb Strix. This is how it did with an i9 9900k.
     Shadow of the Tomb Raider           1080p    1440p    4K
     ASUS Strix 3080 (370/450 watts)     167fps   144fps   87fps
     EVGA FTW3 Ultra (400/450 watts)     169fps   154fps   96fps
     Horizon Zero Dawn
     ASUS Strix 3080 (370/450 watts)     134fps   118fps   80fps
     EVGA FTW3 Ultra (400/450 watts)     155fps   138fps   88fps
     Assassin's Creed Odyssey
     ASUS Strix 3080 (370/450 watts)     107fps   82fps    64fps
     EVGA FTW3 Ultra (400/450 watts)     106fps   89fps    72fps
     Two of the games are CPU limited at 1080p.
  5. I did SLI with GTX 1080 tis that were about the same power as the 2080. Over time it got to be a pain to use, and for its last year of life it was productivity only. The setup was replaced with a single 2080 ti. Remembering when to switch SLI off was one of the reasons it became a pain to use.
  6. The PSU was a Corsair 850 I bought in 2015. It was my first modular PSU. The point of failure was at the connector, on the side facing the GPU. I only noticed it after unplugging the GPU. I replaced all the cables with a sleeved kit. I do not have a lot of faith in the cables that come with PSUs. Since 2019 I have had 2 cable failures out of 7 builds. One was the 24 pin on an EVGA 1300 G2. The other was a PCIe cable on a 1000 watt EVGA G+. It took over 6 months for the issues to come up, and I consider the cause to be poor quality control. I also have setups that use more than 150 watts on an 8 pin, but there is nothing budget about them. One of my 3090 tis has a 12 pin to 2x 8 pin (2 pins blank) cable with instructions to use it with a 1000 watt PSU. It is beautifully made and I can trust it to do the job.
  7. Cables are not all the same. Some are junk. I had a GTX 980 ti SLI rig. Each 980 ti drew 250 watts. I used one cable (8 pin plus 6 pin) for each 980 ti and lost one card and the motherboard. Since then I go by those "horse shit" ratings.
  8. It depends on what 3080 you have. An 8 pin connector is rated at 150 watts and a PCIe slot is 75 watts, so two 8 pins plus the slot gives you 375 watts. Some cards (Tuf) draw 340 watts, others (Strix) draw 370 watts and some 400 watts (FTW3). I lost a card and a motherboard with a PCIe cable so I play it safe. There is a rough power-budget sketch after this list.
  9. I would go with a PCIe 4 SSD. In some games it does make a big difference and it is well worth the extra money. The case is not a good design for a 3080 ti. I would go with a case that has intake below the GPU. I like the Lian Li O11 Dynamic. For cooling a 5800x I would use a D15S or a 360mm AIO. Mine was on a 360mm AIO and it idled in the low 30s and gamed in the 50s and 60s.
  10. I hit the PrtScn key and use Paint 3D (Ctrl+V) to crop and save. For games I record them with GeForce Experience and use the PrtScn key to take a still, then it is Paint 3D for the save.
  11. See if the card is power starved. I use GPU-Z for this. Here is the first stock Time Spy run I did with a 3090 ti to see if it is running cool and using the proper amount of power. Mine is a 450 watt card stock and it used 455 watts. I am monitoring the CPU and motherboard VRMs as well. They too can cause stutter if they get too hot. There is a small monitoring sketch after this list.
  12. I bought the same card for the same price. Now the card is in my game server and was replaced with a 3090 Gaming X Trio. It is a bit of a waste, but I sold all my 10 series cards that usually do those types of jobs. I bought mine to use with an LG OLED as well. Needed that 4k 120hz goodness. My 3x 1080 tis sold for about $485 each at the end of last year. One of my 2080 tis went as well for $1,100. It is a market, so it is best to sell high and buy low. I still have two 2080 tis that did not get replaced before the price drop, so they will sit in their boxes for now.
  13. I did 3D for a living, and for a Ryzen setup I would use the system I have listed below.
      CPU: 5900X
      Motherboard: X570 AORUS Master
      RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200
      GPU: EVGA FTW3 ULTRA RTX 3080 ti
      PSU: EVGA 1000 G+
      Case: Lian Li O11 Dynamic
      Cooler: EK 360mm AIO
      SSD#1: Corsair MP600 1TB
      SSD#2: Crucial MX500 2.5" 2TB
      With long overnight CPU renders, even with a cool CPU like the 5900x, the motherboard may overheat in a hot climate. I like AORUS Masters for this but will use an Ultra as well. The 5900x does not render hot but does idle hot, so I use a 360mm AIO. For a GPU rendering only rig I would use Intel with enough cores to make Photoshop happy, and that is about 8. Intels are cooler when not rendering and use about half the watts at idle. For Intel I use AORUS Master motherboards as well. I have ASUS boards, but I will not use less than a Hero since I have had issues when I did. Usually the issues were CPUs running hot. GPUs can get hot rendering, so I now like cases with air intakes below the GPU like the Lian Li O11 Dynamic. For a more traditional setup I like to use cases with 200mm fans. With GPU rendering, vram becomes an issue, so the more the better. I used 11gb for years and 12gb was not an upgrade. The 3090 with 24gb was a good upgrade, so you may want to wait for prices to come down in your area of the world. I did 3D design, so about 7 to 200 renderings per project. With that, storage was not an issue, so 1 or 2tb SSDs. Slow storage is an issue, so since 2007 I have not used any 7,200rpm HDDs. If I have to use them they are remote, like on a server or USB drive. A texture that takes 1 second to load from an SSD takes 5 seconds from a 7,200rpm HDD, and that can add up in a complex scene (there is a rough load-time sketch after this list). Also, most consumer HDDs are designed to move small amounts of data fast but will choke moving large amounts. This can cause errors and crashes, usually in the middle of the night after multiple renders. Heat over time is what caused most issues, so I never overclocked my rendering rigs, and today with hotter components I would probably be undervolting them.
  14. I started using one in 2015. It was a 60hz 3440 x 1440 Samsung. I replaced it in 2019 with the LG. In 2020 my son wanted more desk space, so I switched out his OLED for my ultrawide. I have it back now, but it is in my test bench area with the i9 9900k. For movies I like the OLEDs, so for going back to ultrawides I need that technology. The market seems to be going that way, so it is only a matter of time.
  15. Ultrawide can be a pain, but I still prefer it over 4k in some games. I am still waiting for a 5120 x 2160 120hz model to come out with an OLED type screen. My LG 38" 3840 x 1600 75hz monitor looks low rent compared to the OLEDs, so I sometimes use them for ultrawide.
  16. Thanks for the info. It is the reason I retired my i9 9900k from gaming even though it does fine with vanilla games. Some of my modded games use more than 12gb of vram, so 3090s are a must. They do 4k 120hz on my OLEDs now, so I have no need for 40 series cards. For my modded games I have always used PassMark's single thread bench to pick CPUs. As you can see, the 12900k is the winner. https://www.cpubenchmark.net/singleThread.html My i9 11900k is way down the list and does fine, so you will have lots of overhead for future modding.
  17. I found that I had to lower clocks for RT games like Control and CP2077 but regular games were fine.
  18. You get a 3090 ti now and replace it with a 4090, then a 4090 ti and then a 5090. That's my plan.
  19. My 2080 ti used +800 on memory for about 2 years without issues, so that is where I started with the 30 series. None went past +1100 without artifacts. On the core all did fine at +100 but had issues at +150, so +145 was about it. If I needed the frames, +100 on the core and +800 on memory is what I would use, since it was stable on the 3080, 3080 ti and 3090, MSI, EVGA and ASUS.
  20. Time Spy likes cores and frequency, so it is not a good bench for a 3600. Here is the Strix with an i7 8086k (6 core). https://www.3dmark.com/3dm/67477054? Here it is with an i9 9900k. https://www.3dmark.com/3dm/64110353? I think it was with +100 on the core and +800 on the memory. The 5ghz Intels do elevate the GPU score, but not by much.
  21. In a game like Battlefield V I would get 1995mhz, but in other games like Shadow of the Tomb Raider it was around 1785 to 1800. Then there are games like Assassin's Creed Odyssey that hit 1950 and stay there with the % going up and down. I don't undervolt my 30 series cards for the same reason I don't overclock, and that is I don't have a reason to. I did test it with the EVGA XC3 Ultra 3080 ti, but it was not smooth. I think that card needs all the volts it can get. Your Suprim, being a premium card, would do a lot better.
  22. The Strix and the Suprim should perform about the same since they are both 370 watt cards. My average is only 1 frame higher, so it is within the margin of error.
  23. Looks fine to me. I did 15384 with a Strix 3080 and a stock i9 9900k. A CPU upgrade may get you more frames, but not at 4k.
  24. I don't have the Gigabyte for the reasons you stated, but I did get the EVGA (FTW3 Ultra) version and it is 10c cooler than my FTW3 Ultra 3080 ti, with the vram 20c cooler. It is on average 10 frames quicker than my 3090 at 4k. I was so happy with the first one I ordered a second. I paid $1,500 at the EVGA store.
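
Power-budget sketch for post 8. This is a minimal illustration in Python of the arithmetic in that post: 150 watts per 8 pin cable and 75 watts from the PCIe slot are the figures given there, and the example card draws are the ones the post names. The function name and the two-cable assumption are just for illustration, not a wiring recommendation.

    # Rough power-budget check (post 8): connector ratings vs. card draw.
    # Ratings from the post: 150 W per 8 pin cable, 75 W from the PCIe slot.
    PCIE_SLOT_W = 75
    EIGHT_PIN_W = 150

    def power_budget(eight_pin_cables: int) -> int:
        """Rated delivery for a card fed by N separate 8 pin cables plus the slot."""
        return PCIE_SLOT_W + eight_pin_cables * EIGHT_PIN_W

    # Example draws named in the post (watts).
    cards = {
        "ASUS Tuf 3080": 340,
        "ASUS Strix 3080": 370,
        "EVGA FTW3 Ultra": 400,
    }

    budget = power_budget(2)  # two separate 8 pin cables + slot = 375 W
    for name, draw in cards.items():
        headroom = budget - draw
        status = "ok" if headroom >= 0 else "over the two-cable rating"
        print(f"{name}: draws {draw} W, budget {budget} W, headroom {headroom} W ({status})")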
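Monitoring sketch for post 11. The post checks power draw with GPU-Z during a Time Spy run; as an illustration only, a similar check can be scripted on an NVIDIA card with the pynvml NVML bindings, assuming the nvidia-ml-py package is installed. It just logs draw and temperature while a benchmark runs in parallel; it is not what GPU-Z itself does and does not cover CPU or motherboard VRMs.

    # Log GPU power draw and temperature while a benchmark runs (post 11).
    # A draw that sits pinned at the enforced limit suggests the card is power starved.
    import time
    import pynvml

    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)                    # first GPU in the system
    limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000  # NVML reports milliwatts

    try:
        for _ in range(60):                                       # about one minute of samples
            draw_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000
            temp_c = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
            pegged = " (at power limit)" if draw_w >= limit_w * 0.98 else ""
            print(f"{draw_w:6.1f} W of {limit_w:.0f} W limit, {temp_c} C{pegged}")
            time.sleep(1)
    finally:
        pynvml.nvmlShutdown()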
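Load-time sketch for post 13. A back-of-the-envelope estimate of how texture loading adds up over a render batch: the 1 second per texture from SSD and 5 seconds from a 7,200rpm HDD are the post's figures, while the texture count and render count below are made-up example numbers, assuming textures are reloaded for each render in the batch.

    # Back-of-the-envelope texture load time over a render batch (post 13).
    SSD_SECONDS_PER_TEXTURE = 1   # figure from the post
    HDD_SECONDS_PER_TEXTURE = 5   # figure from the post

    textures_in_scene = 150       # hypothetical complex scene
    renders_in_batch = 20         # hypothetical overnight batch

    def load_hours(seconds_per_texture: int) -> float:
        """Hours spent only on texture loading across the whole batch."""
        return textures_in_scene * renders_in_batch * seconds_per_texture / 3600

    print(f"SSD: {load_hours(SSD_SECONDS_PER_TEXTURE):.1f} hours loading textures")
    print(f"HDD: {load_hours(HDD_SECONDS_PER_TEXTURE):.1f} hours loading textures")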