
I'm buying a 3090, and this is my logic

Recon801

SLI is dead/dying because they couldn't make it as practical as focusing on single-card solutions for most problems. The only niche left for SLI is high VRAM usage, where memory is used more efficiently or there's simply more of it.

Level 2 Tech Support for a Corporation servicing over 12,000 users and devices, AMA

Desktop - CPU: Ryzen 5800x3D | GPU: Sapphire 6900 XT Nitro+ SE | Mobo: Asus x570 TUF | RAM: 32GB CL15 3600 | PSU: EVGA 850 GA | Case: Corsair 450D | Storage: Several | Cooling: Brown | Thermal Paste: Yes

 

Laptop - Dell G15 | i7-11800H | RTX 3060 | 16GB CL22 3200

 

Plex - Lenovo M93p Tiny | Ubuntu | Intel 4570T | 8GB RAM | 2x 8TB WD RED Plus 


A twin SLI 3090 24GB setup is going to net you more performance than a 3080 10GB. One can argue how much more, but it is going to be more.

I assume you are not budget constrained, so twin 3090s are, IMO, the setup that is going to net you the absolute highest amount of GPU performance possible with consumer hardware. That setup is only going to be beaten by a hypothetical single/twin 3090 Ti 48GB or by a leprechaun AMD card.

If you want the most GPU that money can buy, twin 3090 24GB is the way to go. You can build an incredible system with two GPU waterblocks.

 

If you value efficiency and can wait, a possible 3080 20GB would surely dispel all your VRAM worries.

AMD might also come along with a hypothetical GPU with 16GB of VRAM and performance that could compete with a 3070 or 3080. Big Navi likely won't bring back CrossFire, but with PCIe 4.0, low-level multi-GPU APIs can take advantage of multiple GPUs to an extent.

 

I wouldn't buy SLI 3090s, but that's me. If you want the best, twin SLI is going to give you the best for at least six months and will last you a long time.


SLI as you know it is officially dead. The 3090 does not support it; it only supports DX12/Vulkan multi-GPU, which currently works in exactly 14 games.

 

Given that multi-GPU is exclusive to $3,000 worth of video cards, it is highly unlikely that game developers will spend time coding for it.

 

https://nvidia.custhelp.com/app/answers/detail/a_id/5082


1 hour ago, Lord Bloobus said:

The only niche left for SLI is high VRAM usage, where memory is used more efficiently or there's simply more of it.

SLI doesn't scale memory; multi-GPU and NVLink do.

Multi-GPU depends on software support, and NVLink is only for Quadro cards; GeForce uses the same connector but runs in SLI mode.
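If you're curious what your own card actually reports, here's a rough sketch using the pynvml bindings (this assumes the nvidia-ml-py package is installed and your driver exposes NVML; on cards without NVLink support the link query simply errors out):

```python
# Rough sketch: query per-link NVLink state with pynvml (nvidia-ml-py).
# Consumer cards without NVLink will just report the links as unsupported.
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):          # older bindings return bytes
            name = name.decode()
        print(f"GPU {i}: {name}")
        for link in range(pynvml.NVML_NVLINK_MAX_LINKS):
            try:
                state = pynvml.nvmlDeviceGetNvLinkState(handle, link)
                print(f"  NVLink link {link}: {'active' if state else 'inactive'}")
            except pynvml.NVMLError:
                break  # link (or NVLink as a whole) not supported on this card
finally:
    pynvml.nvmlShutdown()
```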

-sigh- feeling like I'm being too negative lately


Another thing I just thought of is the economics behind SLI. I'd also say it isn't going to stay viable, since most consumers aren't willing to spend money on two high-end graphics cards even if it gives a decent performance boost. Most people just won't want to dig that deep, or can't dig that deep all at once, to afford the two high-end GPUs that SLI requires. Perhaps it's a viable purchase now with the 980 Ti, but the fact is new cards are way too expensive for enough consumers to buy two of them, so SLI will die out for lack of people using it.


Depending on what AMD brings to the table, Nvidia seems to have a 16GB or 20GB 3080 planned if needed (more or less confirmed by AIB leaks). If this comes in at or under $1k, it could be an option that brings much better value and will age better than the current 3080. Can you wait a few months?


20 hours ago, Recon801 said:

Regarding the price, I'm assuming a purchase of a 3090 at launch for $1,500, and then the purchase of a used one at the 50-series launch for a boost in performance. Graphics cards typically sell for about 25% of their value after the second launch that follows them. For example, the Pascal Titan X's MSRP was $1,200, but now that the 30 series is releasing you can purchase one for $290-350 used.

 

While I do make projections, I believe they are very realistic given past launches and iterations from Nvidia. The "leak" regarding a 20GB 3080 is from a single unreliable source, and several people have simply repeated it. It doesn't make sense for Nvidia to release a 20GB 3080, as it would make the 3090 obsolete immediately.

 

Regarding SLI, do you run SLI? Because I always have, and I'm telling you all the major game launches do support it: Call of Duty, Battlefield, Ark, Mass Effect, Fallout, Anthem, Red Dead. I would in fact be interested if you could find any major release that doesn't support SLI. I'm sure there's one out there, but the notion that "SLI is dead" is simply not true. Excellent scaling, too: I'd say I average about 60-70%, as I said previously, but I get 100% in several titles as well.

Not necessarily true as we go forward. Most people play at 1080p, and for that you don't need that powerful a card. Thus, the higher-end cards will hold their value for far longer, and that's been shown true with the 1080 Ti, a card which has only lost about half of its value over the same time frame you've stated.
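As a rough sanity check on the two resale assumptions in this thread (the ~25% retention in the quoted post vs. the ~50% the 1080 Ti comparison implies), here's a minimal sketch; both rates are the posters' estimates, not market data:

```python
# Rough check of the "buy a second used 3090 later" math under the two retention
# figures mentioned in this thread. These are estimates, not measured prices.
LAUNCH_PRICE = 1500  # 3090 Founders Edition MSRP

for label, retention in [("~25% retention (OP's assumption)", 0.25),
                         ("~50% retention (1080 Ti comparison)", 0.50)]:
    used_price = LAUNCH_PRICE * retention
    pair_total = LAUNCH_PRICE + used_price
    print(f"{label}: second-hand 3090 ~${used_price:.0f}, total for the pair ~${pair_total:.0f}")
```

At 25% retention the second used card runs roughly $375; at 50% it's roughly $750, which is where the disagreement over the "SLI it later" plan's total cost comes from.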

 

The 20GB leak is certainly not from a single source; it's even been shown on Gigabyte product sheets. It also wouldn't make the 3090 obsolete, as there's more to the 3090 than just its VRAM; it's a more powerful card. By that logic, the 2080 would have dethroned the 2080 Ti, since their VRAM numbers were similar.

 

I don't have to run SLI to know that it's not worth it. I can simply watch virtually every creator that has said it's a poor choice; you know, people who work with tech for a living, rather than one person who sunk money into it and would thus be biased toward it. Have you run proper benchmarks to come to your claimed figures, or are you just formulating numbers out of thin air?

CPU: Ryzen 9 5900 Cooler: EVGA CLC280 Motherboard: Gigabyte B550i Pro AX RAM: Kingston Hyper X 32GB 3200mhz

Storage: WD 750 SE 500GB, WD 730 SE 1TB GPU: EVGA RTX 3070 Ti PSU: Corsair SF750 Case: Streacom DA2

Monitor: LG 27GL83B Mouse: Razer Basilisk V2 Keyboard: G.Skill KM780 Cherry MX Red Speakers: Mackie CR5BT

 

MiniPC - Sold for $100 Profit


CPU: Intel i3 4160 Cooler: Integrated Motherboard: Integrated

RAM: G.Skill RipJaws 16GB DDR3 Storage: Transcend MSA370 128GB GPU: Intel 4400 Graphics

PSU: Integrated Case: Shuttle XPC Slim

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

Budget Rig 1 - Sold For $750 Profit


CPU: Intel i5 7600k Cooler: CryOrig H7 Motherboard: MSI Z270 M5

RAM: Crucial LPX 16GB DDR4 Storage: Intel S3510 800GB GPU: Nvidia GTX 980

PSU: Corsair CX650M Case: EVGA DG73

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

OG Gaming Rig - Gone


 

CPU: Intel i5 4690k Cooler: Corsair H100i V2 Motherboard: MSI Z97i AC ITX

RAM: Crucial Ballistix 16GB DDR3 Storage: Kingston Fury 240GB GPU: Asus Strix GTX 970

PSU: Thermaltake TR2 Case: Phanteks Enthoo Evolv ITX

Monitor: Dell P2214H x2 Mouse: Logitech MX Master Keyboard: G.Skill KM780 Cherry MX Red

 

 


17 hours ago, Recon801 said:

Who ever cares about ray tracing? It's still barely supported by any games. You want to talk about dead? Ray tracing is dead, not SLI. And in the tiny handful of games that do support it, it kills your FPS so much that you won't use it.

Denying new tech when talking about how you would deal with future upgrade plans. Ironic.

 

17 hours ago, Recon801 said:

I literally play games right now, today in 2020, and get 100% scaling. If I can go from 40 FPS to 80 FPS, that is not dead. Not even close.

Then why not get a pair of used 2080 Tis for SLI? Faster than a 3090 at the cost of a 3080 (more VRAM too) and widely available.

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.47V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: HP EX920 1TB PCIe x4 M.2 SSD + Crucial MX500 1TB 2.5" SATA SSD, 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally, 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172) left outside Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single thread:168 Multi-thread: 833 

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s



It's really just a matter of how much you are willing to pay for the fastest single card; or, for each percent the 3090 is better than the 3080, how much are you paying? If it's 10%, then I personally wouldn't get it, but 15% is borderline. Another thing to consider is how the best 3080 OC compares to the best 3090 OC, and the actual performance under those conditions.
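To put numbers on that "how much per percent" framing, here's a quick sketch using launch MSRPs; the uplift percentages are the ones being debated in this thread, not measurements:

```python
# "Dollars per extra percent of performance" for a 3090 over a 3080, at launch MSRPs.
PRICE_3080 = 699
PRICE_3090 = 1499
EXTRA = PRICE_3090 - PRICE_3080  # $800 premium

for uplift_pct in (10, 15, 20):
    print(f"{uplift_pct:>2}% faster: ${EXTRA} extra, ~${EXTRA / uplift_pct:.0f} per percentage point")
```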

 

SLI is dead; anything you can SLI (except the 3090) gets trashed by the 3080 without the bugs (imagine 2080 Ti SLI vs. a 3080).

 

I'd suggest seeing Control maxed out (in person) with RT on and without DLSS before trashing ray tracing. DLSS is a free 30-60% boost to FPS for those who don't mind a little blur.

 

Waiting on 3090 benchmarks.

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


On 9/19/2020 at 3:53 PM, MadPistol said:

What monitor do you have?

SLI is dead because as of January 2021, Nvidia will no longer be making SLI profiles. 

That clearly does not mean that SLI is dead. The article from last week makes it clear that DX12 and Vulkan provide the same advantages without the need for profiles. A tech does not need to be mainstream to be influential; in fact, what influences the future rarely is mainstream.

VR Snob ... looking for ultimate full-power full-portable no-compromise VR Box ... Streacom's DA2 starting to look good ...


19 hours ago, Recon801 said:

If you have a 1080 Ti right now, you shouldn't buy a 3080, you should buy another 1080 Ti.

Uh, no. Don't base your decision on those fake videos; they're just montages. SLI is a thing of the past.

 

Everyone who had a 1080 Ti could have waited for the 3000 series, and now the 3080 is here... it's a good upgrade, but since this is Nvidia, if the 1080 Ti still does the trick you might as well wait for a 3080 Ti or 3080 Super.

Personal Desktop:

CPU: Intel Core i7 10700K @5ghz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490UD ATX|~| RAM: 16gb DDR4 3333mhzCL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot:  SSD WD Green M.2 2280 240GB |~| Storage: 1x3TB HDD 7200rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950XT  |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Aorus Master |~| RAM: 32G Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)

20 hours ago, Recon801 said:

Honestly, the thing that has shocked me more than anything on this thread is the massive lack of knowledge regarding SLI.

Jay talks about SLI repeatedly in this. You should watch it.

CPU: Ryzen 7 5800X Cooler: Arctic Liquid Freezer II 120mm AIO with push-pull Arctic P12 PWM fans RAM: G.Skill Ripjaws V 4x8GB 3600 16-16-16-30

Motherboard: ASRock X570M Pro4 GPU: ASRock RX 5700 XT Reference with Eiswolf GPX-Pro 240 AIO Case: Antec P5 PSU: Rosewill Capstone 750M

Monitor: ASUS ROG Strix XG32VC Case Fans: 2x Arctic P12 PWM Storage: HP EX950 1TB NVMe, Mushkin Pilot-E 1TB NVMe, 2x Constellation ES 2TB in RAID1

https://hwbot.org/submission/4497882_btgbullseye_gpupi_v3.3___32b_radeon_rx_5700_xt_13min_37sec_848ms


21 hours ago, comander said:

1. It's not JUST the cost of the card, it's the cost of the card PLUS interest. If your capital cost is 5%, then the extra $800 at 5% per year means $40 worth of extra cost each year.

Point taken, and technically correct, though based on the OP's logic I really don't think the alternative was putting the money into an index fund for growth vs. buying 2x 3090s, just sayin' :)
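For what it's worth, the quoted interest point scales with how long you hold the cards; a tiny sketch, assuming the same 5% rate and an $800 premium:

```python
# The quoted interest point, stretched over a multi-year hold (the thread assumes
# keeping cards until roughly the 60-series). The 5% rate is the quoted assumption.
EXTRA_SPEND = 800  # rough 3090-over-3080 premium
RATE = 0.05

for years in (1, 3, 6):
    simple = EXTRA_SPEND * RATE * years
    compounded = EXTRA_SPEND * ((1 + RATE) ** years - 1)
    print(f"{years} year(s): simple ~${simple:.0f}, compounded ~${compounded:.0f}")
```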

El Zoido:  9900k + RTX 4090 / 32 gb 3600mHz RAM / z390 Aorus Master 

 

The Box:  3900x + RTX 3080 /  32 gb 3000mHz RAM / B550 MSI mortar 


On 9/20/2020 at 12:09 PM, dizmo said:

I don't have to run SLI to know that it's not worth it. I can simply watch virtually every creator that has said it's a poor choice; you know, people who work with tech for a living, rather than one person who sunk money into it and would thus be biased toward it. Have you run proper benchmarks to come to your claimed figures, or are you just formulating numbers out of thin air?

I'm not formulating numbers out of thin air; you're welcome to watch any video online comparing SLI performance across generations. I have already posted examples in this thread. What graphics card do you have?

 

I know from my own testing, and others', that SLI 980 Tis offer equal performance to a 2080 Ti.

I know from my own testing, and others', that SLI 1080 Tis offer equal performance to a 3080.

 

20 hours ago, BTGbullseye said:

Jay talks about SLI repeatedly in this. You should watch it.

lol, I appreciate the link, but I'm not going to watch a 5-hour video to find a comment on SLI. Maybe you can give a rough timestamp.


2 minutes ago, Recon801 said:

lol, I appreciate the link, but I'm not going to watch a 5-hour video to find a comment on SLI. Maybe you can give a rough timestamp.

If you start at the 4-hour mark, they talk about it in every other superchat they answer.



6 minutes ago, BTGbullseye said:

If you start at the 4-hour mark, they talk about it in every other superchat they answer.

Thanks, I'll check it out.


SLI support will only continue to be deprecated over time, so in my opinion it should not be treated as any long-term gaming feature in your logic. It is knocking on death's door.

 

A lot of your post clearly indicates you are making assumptions.

 

It essentially boils down to this: do you want to spend 114% more money than on the 3080 for maybe, at best, 10% more performance and 24GB of GDDR6X? The memory alone won't carry you to the 60-series if we figure generational cycles are two years now; that's six years. In my opinion, the 3090 will run into other, more basic performance limits before its VRAM is saturated.

 

It's poor value, and really the only logic I can see is NVLinking two of them for non-gaming purposes as cheap Quadros or something; otherwise it's just pissing money away for minimal gaming gains. I see no logic in buying one, let alone two of them, for gaming at their exorbitant price over the 3080.

 

EDIT:

On 9/19/2020 at 3:44 PM, Recon801 said:
Comparison at 60-series release in 2026/2027:
A) 3090 now, with SLI upgrade: performance 1.9 with 24GB VRAM - total cost: $1,900 (better performance per dollar, more or equal VRAM)
B) 3080 now, with 4080-or-5080 upgrade, and then 6080 upgrade: performance 1.95 with 20/24GB VRAM - total cost: $2,100-2,200
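Plugging the quoted A/B projections into a quick script makes them easier to compare at a glance; note that both the performance indices and the costs are the OP's own projections, not benchmarks:

```python
# The quoted A/B upgrade paths, using the OP's projected figures (not benchmarks).
# "perf" is the OP's relative index; path B's cost uses the midpoint of $2,100-2,200.
paths = {
    "A: 3090 now + used 3090 later (SLI)":      {"perf": 1.90, "cost": 1900},
    "B: 3080 now -> 40/50-series -> 60-series": {"perf": 1.95, "cost": 2150},
}

for name, p in paths.items():
    print(f"{name}: perf {p['perf']}, cost ${p['cost']}, "
          f"~{p['perf'] / p['cost'] * 1000:.2f} perf per $1,000")
```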

  

Considering that, in gaming, leaks suggest it is not much better than the 3080, I don't see it carrying you any further than a 3080 would. Plus, the minimal increase in performance comes with extra power draw, and those costs over time shouldn't be ignored either. But that is just my opinion.

 

If VRAM is the main concern, which I admit it would be for me as well at 4K a few years from now, I think a potential 20GB 3080 makes more sense for gaming than the 3090. But that also depends on how much more a 20GB 3080 would cost.

Zen 3 Daily Rig (2022 - Present): AMD Ryzen 9 5900X + Optimus Foundations AM4 | Nvidia RTX 3080 Ti FE + Alphacool Eisblock 3080 FE | G.Skill Trident Z Neo 32GB DDR4-3600 (@3733 c14) | ASUS Crosshair VIII Dark Hero | 2x Samsung 970 Evo Plus 2TB | Crucial MX500 1TB | Corsair RM1000x | Lian Li O11 Dynamic | LG 48" C1 | EK Quantum Kinetic TBE 200 w/ D5 | HWLabs GTX360 and GTS360 | Bitspower True Brass 14mm | Corsair 14mm White PMMA | ModMyMods Mod Water Clear | 9x BeQuiet Silent Wings 3 120mm PWM High Speed | Aquacomputer Highflow NEXT | Aquacomputer Octo

 

Test Bench: 

CPUs: Intel Core 2 Duo E8400, Core i5-2400, Core i7-4790K, Core i9-10900K, Core i3-13100, Core i9-13900KS

Motherboards: ASUS Z97-Deluxe, EVGA Z490 Dark, EVGA Z790 Dark Kingpin

GPUs: GTX 275 (RIP), 2x GTX 560, GTX 570, 2x GTX 650 Ti Boost, GTX 980, Titan X (Maxwell), x2 HD 6850

Bench: Cooler Master Masterframe 700 (bench mode)

Cooling: Heatkiller IV Pro Pure Copper | Koolance GPU-210 | HWLabs L-Series 360 | XSPC EX360 | Aquacomputer D5 | Bitspower Water Tank Z-Multi 250 | Monsoon Free Center Compressions | Mayhems UltraClear | 9x Arctic P12 120mm PWM PST


DCS VR. 

 

I need more than 10GB.

 

I also need GSync, RTX and DLSS.

 

goddammit.

9900K  / Asus Maximus Formula XI / 32Gb G.Skill RGB 4266mHz / 2TB Samsung 970 Evo Plus & 1TB Samsung 970 Evo / EVGA 3090 FTW3.

2 loops : XSPC EX240 + 2x RX360 (CPU + VRMs) / EK Supremacy Evo & RX480 + RX360 (GPU) / Optimus W/B. 2 x D5 pumps / EK Res

8x NF-A2x25s, 14 NF-F12s and a Corsair IQ 140 case fan / CM HAF Stacker 945 / Corsair AX 860i

LG 38GL950G & Asus ROG Swift PG278Q / Duckyshine 6 YOTR / Logitech G502 / Thrustmaster Warthog & TPR / Blue Yeti / Sennheiser HD599SE / Astro A40s

Valve Index, Knuckles & 2x Lighthouse V2

 

 


39 minutes ago, Sir Beregond said:

SLI support will only continue to be deprecated over time, so in my opinion it should not be treated as any long-term gaming feature in your logic. It is knocking on death's door.

 

A lot of your post clearly indicates you are making assumptions.

 

It essentially boils down to this: do you want to spend 114% more money than on the 3080 for maybe, at best, 10% more performance and 24GB of GDDR6X? The memory alone won't carry you to the 60-series if we figure generational cycles are two years now; that's six years. In my opinion, the 3090 will run into other, more basic performance limits before its VRAM is saturated.

 

It's poor value, and really the only logic I can see is NVLinking two of them for non-gaming purposes as cheap Quadros or something; otherwise it's just pissing money away for minimal gaming gains. I see no logic in buying one, let alone two of them, for gaming at their exorbitant price over the 3080.

 

EDIT:

  

Considering that, in gaming, leaks suggest it is not much better than the 3080, I don't see it carrying you any further than a 3080 would. Plus, the minimal increase in performance comes with extra power draw, and those costs over time shouldn't be ignored either. But that is just my opinion.

 

If VRAM is the main concern, which I admit it would be for me as well at 4K a few years from now, I think a potential 20GB 3080 makes more sense for gaming than the 3090. But that also depends on how much more a 20GB 3080 would cost.

My only concern with the 3080 is the VRAM. 10GB simply isn't enough for 4K, especially when you start introducing mods. Nvidia also restricted its bandwidth to choke performance.

 

For the moment I'm going to wait for AMD to show their hand, and for Nvidia to respond. There's no availability to upgrade right now anyway.

 

Regarding the 3090's performance over the 3080, we already have benchmarks that show a 15-21% gain over the 3080 in games and synthetics at 4K. Only that initial Chinese leak showed a 10% gain.


1 hour ago, Recon801 said:

I'm not formulating numbers out of thin air; you're welcome to watch any video online comparing SLI performance across generations. I have already posted examples in this thread. What graphics card do you have?

 

I know from my own testing, and others', that SLI 980 Tis offer equal performance to a 2080 Ti.

I know from my own testing, and others', that SLI 1080 Tis offer equal performance to a 3080.

Or, we could go with reviews: people who have experience testing hardware. The numbers provided below are based on 1440p performance.

 

1080Ti SLI

  • Didn't work, or had severe issues in almost half the titles tested.
  • Other titles saw an average of a 50% increase. Some were significantly lower.
  • Of the 25 titles tested, only 2 showed greater than 60% scaling, and one was a synthetic benchmark.

https://babeltechreviews.com/gtx-1080-ti-sli-performance-25-games/3/

 

1080Ti vs 3080

  • Most titles see a 70-100% increase over the 1080 Ti. Since SLI only gives you an average performance increase of 50%, no, 1080 Ti SLI does not offer equal performance to the 3080... and that's when SLI is working well. That's also ignoring the roughly 50% of titles it doesn't work in or has issues with.

https://babeltechreviews.com/rtx-3080-arrives-ampere-performance-revealed-35-games-benchmarked/4/

 

So, is SLI an improvement? Sure, if it works. But it's nowhere near what you claim.
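To make the gap concrete, here's a small sketch that puts the cited figures side by side: the ~50% average SLI scaling from the review above versus the 70-100% uplift of a 3080 over a single 1080 Ti. No new numbers, just the ones already cited in this post:

```python
# Side-by-side of the cited figures: ~50% average SLI scaling (in titles where SLI
# worked at all) vs the 70-100% uplift of a 3080 over a single 1080 Ti.
BASE = 1.0  # single 1080 Ti

sli_1080ti = BASE * (1 + 0.50)
print(f"1080 Ti SLI (when it works): ~{sli_1080ti:.2f}x a single 1080 Ti")

for uplift in (0.70, 1.00):
    print(f"RTX 3080 at +{uplift:.0%}: ~{BASE * (1 + uplift):.2f}x a single 1080 Ti")
```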



10 minutes ago, Radioactive Snowman said:

There are a lot of rumours of a 20GB 3080 coming out in the coming months; you might want to wait and see if they pan out.

Rumors are also that it'll be another year before they release.



1 hour ago, Recon801 said:

My only concern with the 3080 is the VRAM. 10GB simply isn't enough for 4K, especially when you start introducing mods. Nvidia also restricted its bandwidth to choke performance.

This is the common fear that is being thrown around the forums (right in line with "but will I be bottlenecked at PCIe 3.0?!"), and the answer is the same: this is incorrect.

 

I play everything at 4K, some games with mods. I have never hit max VRAM (11GB); I have not even come close yet. I've played a couple of games with mods (The Witcher, Skyrim) at 4K Ultra, with triple buffering when the game allows it, and I have seriously never gone above 10GB.

 

If it were a thing, there would have been multiple arguments that the Titan RTX had some value compared to the 2080 Ti, given that it has twice the VRAM, and that maybe that would make it the 4K card... but it turns out they perform almost identically at 4K, because 4K does not require all that VRAM and 11GB is plenty. We would have seen a significant difference if VRAM were an issue here.

 

4K does not require 20+ GB of VRAM; please put this myth to rest.
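Anyone who wants to settle this on their own system rather than argue can just watch the numbers: a minimal sketch using the pynvml bindings (assuming the nvidia-ml-py package is installed), run alongside a game:

```python
# Minimal VRAM poller using pynvml (nvidia-ml-py). Run it alongside a game to see
# how close you actually get to the card's limit.
# Note: NVML reports allocated memory, which can overstate what a game strictly needs.
import time
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
    for _ in range(30):                            # sample once a second for ~30 s
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb, total_gb = mem.used / 1024**3, mem.total / 1024**3
        print(f"VRAM: {used_gb:.1f} / {total_gb:.1f} GB ({mem.used / mem.total:.0%})")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```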



2 hours ago, WihGlah said:

DCS VR. 

 

I need more than 10GB.

 

I also need GSync, RTX and DLSS.

 

goddammit.

Apparently initial testing seems to show that, for DCS, the 3080 offers minimal to no improvement over the 1080 Ti right now. It sounds like the game is purely bound by CPU single-thread performance at the moment, and will likely remain so until they finish refactoring the engine onto Vulkan, which is a huge bummer for me. :(


Here's my logic for getting a 3090. It's quite simple, really, and the math behind it is not hard to understand.

 

If you game at 4K Ultra, 10GB is "just enough" VRAM in 2020, but it likely won't be in 2021. Even the release of Cyberpunk 2077 in November might push you over the edge of 10GB. Nvidia knows it's an issue, hence why they have the 3070 16GB and the 3080 20GB ready to go. Nvidia thinks they can make a product out of selling these cards with more memory than they have now; I bet they know what they're doing and will make money off of us if they can. Even the value option of the 70 class, the RTX 3070 16GB, will have more than 10GB. That's not for enthusiasts to have something they don't need, but for the majority of customers seeking good value.

 

Cost of a 3080 10GB in 2020 ($699) + cost of a 3080 20GB in 2021 > cost of an RTX 3090 in 2020.
This holds under the condition that the RTX 3080 20GB is more than $800 (only a $100 price increase), which I think it will be; probably more like the $1,000 range.
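Written out as a quick check (only the $699 and $1,499 MSRPs are real numbers; the 20GB prices are guesses, and the buy/sell/ship friction mentioned below is left out):

```python
# Break-even check: "3080 10GB now + 3080 20GB later" vs "3090 now".
# Resale of the first card and shipping friction are ignored in this simple version.
PRICE_3080_10GB = 699
PRICE_3090 = 1499

for price_20gb in (699, 799, 899, 999):
    two_step = PRICE_3080_10GB + price_20gb
    verdict = "more than a 3090" if two_step > PRICE_3090 else "still under a 3090"
    print(f"20GB 3080 at ${price_20gb}: two-step total ${two_step} -> {verdict}")
```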

 

Enough said. Would you rather take the hit of buying, selling, and shipping one graphics card, or two? Regardless, if the 3080 20GB lands north of $800 (and I think it will be much more than that), it would have been cheaper to just get a 3090.

 

SLI is dead.

 

Even if the 3080 20GB matches the price of the 3080 10GB (and there's only a slim chance of that happening), by the time you buy, sell, and ship two graphics cards versus a single 3090, it would have been cheaper to just get the 3090.

 

There's a slim chance that the 3080 20GB will be cheaper than the 3080 10GB, but we all know that's not going to happen. It will likely cost more than $699; adding $100 seems like a good place to start, and it probably won't be less than that, quite possibly much more.

 

Think about it logically here, guys. Getting a single 3090 just makes more sense financially than getting a 3080 10GB plus a 3080 20GB.

