Who actually needs an RTX 3090?

Yoshi Moshi
2 hours ago, MadPistol said:

Let's be honest. No one "needs" a dedicated GPU. Integrated graphics works fine in most cases.

A 3070/3080/3090 is never a "need" proposition. We need food and water to survive. An RTX 3090 is simply nice to have.

Bah...semantics.  We all know the difference here.  😋


5 hours ago, valdyrgramr said:

The Doom test was complete BS: they used an older driver for the 2080 Ti than for the 3080, to mislead as usual. The only areas where you're going to see a real difference in gaming are where RT and/or DLSS are a factor, but Doom Eternal has neither, so they misled. Here's actual performance at 4K with a 2080 Ti that's not even as well binned as the FE, using the same driver they used for the 3080.

 

 

 

Saw more than a few tests running Doom Eternal on 2080 Tis, and yeah, it was about the same as or higher than what Nvidia showed for the 3080.

 

And that was with the settings maxed out at 4K.

 

So, like I have been saying all along, it's really going to be the 3080 that competes with the 2080 Ti, not the 3070 like people seem to think.

 

That is actually more believable too.

 

But we will all see soon enough.

 

But we do know RT and DLSS cut down the frame rates of the 2080 Ti quite a bit.... So I would hope the 3080 would beat it with RT and DLSS on... I mean, with all of the improvements they said they made with the Gen 2 RT...

 

I do think the 3090 is going to be a Monster though..

i9 9900K @ 5.0 GHz, NH D15, 32 GB DDR4 3200 GSKILL Trident Z RGB, AORUS Z390 MASTER, EVGA RTX 3080 FTW3 Ultra, Samsung 970 EVO Plus 500GB, Samsung 860 EVO 1TB, Samsung 860 EVO 500GB, ASUS ROG Swift PG279Q 27", Steel Series APEX PRO, Logitech Gaming Pro Mouse, CM Master Case 5, Corsair AXI 1600W Titanium. 

 

i7 8086K, AORUS Z370 Gaming 5, 16GB GSKILL RJV DDR4 3200, EVGA 2080TI FTW3 Ultra, Samsung 970 EVO 250GB, (2)SAMSUNG 860 EVO 500 GB, Acer Predator XB1 XB271HU, Corsair HXI 850W.

 

i7 8700K, AORUS Z370 Ultra Gaming, 16GB DDR4 3000, EVGA 1080Ti FTW3 Ultra, Samsung 960 EVO 250GB, Corsair HX 850W.

 

 


10 hours ago, valdyrgramr said:

I think you mean Vega 2, which is the Radeon VII. In Blender, the 2080 Ti / Quadro RTX 6000 and the VII are on the same tier now, depending on which render engine you use. With improved drivers the VII games closer to the original 2080, while at launch it suffered more from driver issues and sat roughly halfway between the 2070 and the 2080. But if you're just gaming, the 1080 Ti was the better value.

 

The Doom test was complete bs and they used an older driver for the 2080 Ti than the 3080 to mislead as usual.   The only areas you're going to see a real difference in gaming is where RT and/or DLSS are a thing, but Doom Eternal lacks both, so they misled.   Here's actual performance at 4k with a 2080 Ti that's not even as binned as the FE using the same driver they used for the 3080.

 

[Spoiler: screenshots of Doom Eternal 4K runs on the 2080 Ti — settings pages and gameplay captures]

^As you can see, corporations mislead to build hype and drive sales of the new product; it's called a marketing strategy.

I very well might have. I hadn't heard the term Vega 2 until yesterday and was confused about what was being discussed. That the two things are the same makes a certain amount of sense.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


12 minutes ago, realshmalex said:

What I don't get is why the 3080 is 10GB...??

I'm waiting for a 3080 (Ti) 12~16GB version

What I don't understand is this.

 

RTX 2070 FE = RTX 2070 Super FE = RTX 3070 FE = $499

RTX 2080 FE = RTX 2080 Super FE = RTX 3080 FE = $699

RTX 2080 Ti FE = $1,199

RTX 3090 = $1,499

RTX Titan = $2,499

 

If the price scheme continues where RTX 3XXX = RTX 2XXX in terms of price, then we can expect:

RTX 2080 Ti FE = RTX 3080 Ti FE / RTX 3080 (20 GB version) FE = $1,199

 

If the difference between the RTX 3080 Ti FE / RTX 3080 (20 GB version) and the RTX 3090 is

$1,499 - $1,199 = $300

then doesn't it make sense to just spend the extra $300 and get an RTX 3090???

 

Big Navi is rumored to have 16 GB of VRAM, so the uninformed consumer will see the 10 GB of VRAM in the RTX 3080 and go with the AMD card. So Nvidia is likely to insert another card between the RTX 3080 and RTX 3090 (perhaps an RTX 3080 Ti or an RTX 3080 20 GB version). There's a difference of

$1,499 - $699 = $800

Plenty of room for another card in the stack.
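For anyone who wants to sanity-check the gaps in that stack, here's a quick back-of-envelope sketch. The MSRPs are the launch prices quoted above; the "3080 Ti / 3080 20 GB" entry is a rumor, not an announced product.

```python
# Launch MSRPs from the post; the 3080 Ti / 3080 20 GB entry is rumored.
stack = [
    ("RTX 3070", 499),
    ("RTX 3080", 699),
    ("RTX 3080 Ti / 3080 20 GB (rumored)", 1199),
    ("RTX 3090", 1499),
]

# Price gap between each adjacent pair in the stack.
gaps = [(hi_name, hi - lo) for (_, lo), (hi_name, hi) in zip(stack, stack[1:])]
for name, gap in gaps:
    print(f"step up to {name}: ${gap}")
```

The $500 jump from the 3080 to the rumored Ti, followed by only $300 up to the 3090, is exactly the "why not just spend the extra $300" argument above.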


3 hours ago, Abyssal Radon said:

Here's the issue with the RTX 3090: the pricing. For $1,499.99 you get 10,496 CUDA cores and 24 GB of GDDR6X RAM, versus 8,702 CUDA cores and 10 GB of GDDR6X RAM for the $699.99 RTX 3080. So essentially you're paying more than twice the amount for a ~20% increase in CUDA cores and a 140% increase in video RAM. For gaming the RTX 3090 is a hard sell, but for content creators there are many situations that require 24 GB of video RAM. Although I am slightly disappointed they didn't give the RTX 3080 12 GB of video RAM, the RTX 3080/3070 is the better proposition for gamers. I am very excited to pick up a 3080, mostly because my 980 Ti is beginning to show its age. Though I am not looking forward to staking out Newegg so I can get a 3080 and not have to wait months for supply to catch up...
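Those percentages check out. A tiny sketch of the math, using the core counts and launch MSRPs quoted in the post:

```python
cuda_3080, cuda_3090 = 8702, 10496
vram_3080, vram_3090 = 10, 24             # GB of GDDR6X
price_3080, price_3090 = 699.99, 1499.99  # USD launch MSRP

def pct(new, old):
    """Percentage increase of `new` over `old`."""
    return (new / old - 1) * 100

print(f"CUDA cores: +{pct(cuda_3090, cuda_3080):.1f}%")   # ~ +20.6%
print(f"VRAM:       +{pct(vram_3090, vram_3080):.1f}%")   # ~ +140.0%
print(f"Price:      +{pct(price_3090, price_3080):.1f}%") # ~ +114.3%
```

So the price rises roughly 114% for a ~21% core bump; the VRAM is the only spec that scales with the price, which is why the card makes more sense for content creators than for gamers.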

 

We won't know until after the reviews and tests come out how they all compare.

 

Once the tests are done, then we can figure money into it and see how it works out..

 

Right now nothing will make any sense, because we have no idea what cards do what compared to which other cards.

 

 

 

 

 

 


In the gaming world, VR is the second most taxing thing after Ray Tracing.  DLSS 2.1 VR support will hopefully be a massive boost to VR.  For perspective, my 1080 Ti would probably be able to render the crap out of Half-Life: Alyx on a traditional monitor, but on my Index...my 1080 Ti actually still gives me reprojection issues in some areas of the game at 90Hz (not 120 or 144).  A 2060 Ti might (in theory) be able to do what my 1080 Ti cannot because of DLSS 2.1's purported 70%+ framerate uplift.

 

The other issue is that the Index auto-scales resolution based on hardware performance and chosen refresh rate.  If my Index is set to 80Hz or 90Hz, the Steam VR hub looks great, but at 120Hz or 144Hz it's VERY fuzzy.  My hope is that a 3080/3090 will not only enable higher frame rates that clean up the visuals, but that DLSS 2.1 + all that extra horsepower will make VR substantially easier to drive.  IMO current VR games have to be designed about 5 years behind the graphical fidelity of modern titles in order to run smoothly on modern hardware. 
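A rough way to see why the hub gets fuzzy at high refresh: the Index panels are 1440x1600 per eye, and the pixel throughput the GPU must sustain scales linearly with refresh rate, so SteamVR's auto-scaler has to shrink the render resolution to compensate. This sketch ignores SteamVR's default supersampling, so treat the numbers as a floor:

```python
PANEL_PX = 2 * 1440 * 1600  # Valve Index: 1440x1600 per eye, both eyes

for hz in (80, 90, 120, 144):
    print(f"{hz:3d} Hz -> {PANEL_PX * hz / 1e6:6.1f} Mpx/s to shade")

# To keep 144 Hz no more expensive than 90 Hz, the auto-scaler must cut
# the render *area* by the refresh ratio:
area_scale = 90 / 144
print(f"render area scale at 144 Hz vs 90 Hz: {area_scale:.3f}")
```

That 0.625x area cut (about 0.79x per axis) is roughly the resolution drop the hub takes when you switch from 90 Hz to 144 Hz, which matches the "VERY fuzzy" observation above.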


Saya ("Me" in Malay). Up until now, Titan-class GPUs have been very prohibitive in terms of cost. If the RTX 3090 is indeed Titan class, with FP64 compute that I can use with Catia and ProEngineer, I'm going to get it. It will pay for itself after 2-3 freelance jobs on top of my daily occupation. The RTX 2080 Ti with its 11 GB of VRAM is okay for now, but I do hit the GPU memory wall if I load any assembly larger than, say, two trains of gas compression on an offshore platform. An integrated central processing plus well-head platform is still okay for now, albeit a bit slow.

 

Not to mention, finite-element calculations are still being done on the CPU. Will wait for reviews though, especially from the engineering crowd.


On 9/12/2020 at 4:50 AM, i_build_nanosuits said:

4k high fps, VR...future proofing for a couple years...

I think the 2080 Ti is a good example of how that can fail... :I

Zen-III-X12-5900X (Gaming PC)

Spoiler

Case: Medion Micro-ATX Case / Case Fan Front: SUNON MagLev PF70251VX-Q000-S99 70mm / Case Fan Rear: Fanner Tech(Shen Zhen)Co.,LTD. 80mm (Purple) / Controller: Sony Dualshock 4 Wireless (DS4Windows) / Cooler: AMD Near-silent 125w Thermal Solution / CPU: AMD Ryzen 5 3600, 6-cores, 12-threads, 4.2/4.2GHz, 35,3MB cache (T.S.M.C. 7nm FinFET) / CPU: AMD Ryzen 9 5900X(ECO mode), 12-cores, 24-threads, 4.5/4.8GHz, 70.5MB cache (T.S.M.C. 7nm FinFET) / Display: HP 24" L2445w (64Hz OC) 1920x1200 / GPU: MSI GeForce GTX 970 4GD5 OC "Afterburner" @1450MHz (T.S.M.C. 28nm) / GPU: ASUS Radeon RX 6600 XT DUAL OC RDNA2 32CUs @2.6GHz 10.6 TFLOPS (T.S.M.C. 7nm FinFET) / Keyboard: HP KB-0316 PS/2 (Nordic) / Motherboard: ASRock B450M Pro4, Socket-AM4 / Mouse: Razer Abyssus 2014 / PCI-E: ASRock USB 3.1/A+C (PCI Express x4) / PSU: EVGA SuperNOVA G2, 550W / RAM A2 & B2: DDR4-3600MHz CL16-18-8-19-37-1T "SK Hynix 8Gbit CJR" (2x16GB) / Operating System: Windows 10 Home / Sound 1: Zombee Z500 / Sound 2: Logitech Stereo Speakers S-150 / Storage 1 & 2: Samsung 850 EVO 500GB SSD / Storage 3: Western Digital My Passport 2.5" 2TB HDD / Storage 4: Western Digital Elements Desktop 2TB HDD / Storage 5: Kingston A2000 1TB M.2 NVME SSD / Wi-fi & Bluetooth: ASUS PCE-AC55BT Wireless Adapter (Intel)

 Lake-V-X6-10600 (Gaming PC)

R23 score MC: 9190pts | R23 score SC: 1302pts

R20 score MC: 3529cb | R20 score SC: 506cb

Spoiler

Case: Cooler Master HAF XB Evo Black / Case Fan(s) Front: Noctua NF-A14 ULN 140mm Premium Fans / Case Fan(s) Rear: Corsair Air Series AF120 Quiet Edition (red) / Case Fan(s) Side: Noctua NF-A6x25 FLX 60mm Premium Fan / Controller: Sony Dualshock 4 Wireless (DS4Windows) / Cooler: Cooler Master Hyper 212 Evo / CPU: Intel Core i5-10600(ASUS Performance Enhancement), 6-cores, 12-threads, 4.4/4.8GHz, 13,7MB cache (Intel 14nm++ FinFET) / Display: ASUS 24" LED VN247H (67Hz OC) 1920x1080p / GPU: Gigabyte Radeon RX Vega 56 Gaming OC @1.5GHz 10.54 TFLOPS (Samsung 14nm FinFET) / Keyboard: Logitech Desktop K120 (Nordic) / Motherboard: ASUS PRIME B460 PLUS, Socket-LGA1200 / Mouse: Razer Abyssus 2014 / PCI-E: ASRock USB 3.1/A+C (PCI Express x4) / PSU: EVGA SuperNOVA G2, 850W / RAM A1, A2, B1 & B2: DDR4-2666MHz CL13-15-15-15-35-1T "Samsung 8Gbit C-Die" (4x8GB) / Operating System: Windows 10 Home / Sound: Zombee Z300 / Storage 1 & 2: Samsung 850 EVO 500GB SSD / Storage 3: Seagate® Barracuda 2TB HDD / Storage 4: Seagate® Desktop 2TB SSHD / Storage 5: Crucial P1 1000GB M.2 SSD/ Storage 6: Western Digital WD7500BPKX 2.5" HDD / Wi-fi: TP-Link TL-WN851N 11n Wireless Adapter (Qualcomm Atheros)

Vishera-X8-9370 | R20 score MC: 1476cb

Spoiler

Case: Cooler Master HAF XB Evo Black / Case Fan(s) Front: Noctua NF-A14 ULN 140mm Premium Fans / Case Fan(s) Rear: Corsair Air Series AF120 Quiet Edition (red) / Case Fan(s) Side: Noctua NF-A6x25 FLX 60mm Premium Fan / Case Fan VRM: SUNON MagLev KDE1209PTV3 92mm / Controller: Sony Dualshock 4 Wireless (DS4Windows) / Cooler: Cooler Master Hyper 212 Evo / CPU: AMD FX-8370 (Base: @4.4GHz | Turbo: @4.7GHz) Black Edition Eight-Core (Global Foundries 32nm) / Display: ASUS 24" LED VN247H (67Hz OC) 1920x1080p / GPU: MSI GeForce GTX 970 4GD5 OC "Afterburner" @1450MHz (T.S.M.C. 28nm) / GPU: Gigabyte Radeon RX Vega 56 Gaming OC @1501MHz (Samsung 14nm FinFET) / Keyboard: Logitech Desktop K120 (Nordic) / Motherboard: MSI 970 GAMING, Socket-AM3+ / Mouse: Razer Abyssus 2014 / PCI-E: ASRock USB 3.1/A+C (PCI Express x4) / PSU: EVGA SuperNOVA G2, 850W PSU / RAM 1, 2, 3 & 4: Corsair Vengeance DDR3-1866MHz CL8-10-10-28-37-2T (4x4GB) 16.38GB / Operating System 1: Windows 10 Home / Sound: Zombee Z300 / Storage 1: Samsung 850 EVO 500GB SSD (x2) / Storage 2: Seagate® Barracuda 2TB HDD / Storage 3: Seagate® Desktop 2TB SSHD / Wi-fi: TP-Link TL-WN951N 11n Wireless Adapter

Godavari-X4-880K | R20 score MC: 810cb

Spoiler

Case: Medion Micro-ATX Case / Case Fan Front: SUNON MagLev PF70251VX-Q000-S99 70mm / Case Fan Rear: Fanner Tech(Shen Zhen)Co.,LTD. 80mm (Purple) / Controller: Sony Dualshock 4 Wireless (DS4Windows) / Cooler: AMD Near-silent 95w Thermal Solution / Cooler: AMD Near-silent 125w Thermal Solution / CPU: AMD Athlon X4 860K Black Edition Elite Quad-Core (T.S.M.C. 28nm) / CPU: AMD Athlon X4 880K Black Edition Elite Quad-Core (T.S.M.C. 28nm) / Display: HP 19" Flat Panel L1940 (75Hz) 1280x1024 / GPU: EVGA GeForce GTX 960 SuperSC 2GB (T.S.M.C. 28nm) / GPU: MSI GeForce GTX 970 4GD5 OC "Afterburner" @1450MHz (T.S.M.C. 28nm) / Keyboard: HP KB-0316 PS/2 (Nordic) / Motherboard: MSI A78M-E45 V2, Socket-FM2+ / Mouse: Razer Abyssus 2014 / PCI-E: ASRock USB 3.1/A+C (PCI Express x4) / PSU: EVGA SuperNOVA G2, 550W PSU / RAM 1, 2, 3 & 4: SK hynix DDR3-1866MHz CL9-10-11-27-40 (4x4GB) 16.38GB / Operating System 1: Ubuntu Gnome 16.04 LTS (Xenial Xerus) / Operating System 2: Windows 10 Home / Sound 1: Zombee Z500 / Sound 2: Logitech Stereo Speakers S-150 / Storage 1: Samsung 850 EVO 500GB SSD (x2) / Storage 2: Western Digital My Passport 2.5" 2TB HDD / Storage 3: Western Digital Elements Desktop 2TB HDD / Wi-fi: TP-Link TL-WN851N 11n Wireless Adapter

Acer Aspire 7738G custom (changed CPU, GPU & Storage)
Spoiler

CPU: Intel Core 2 Duo P8600, 2-cores, 2-threads, 2.4GHz, 3MB cache (Intel 45nm) / GPU: ATi Radeon HD 4570 515MB DDR2 (T.S.M.C. 55nm) / RAM: DDR2-1066MHz CL7-7-7-20-1T (2x2GB) / Operating System: Windows 10 Home / Storage: Crucial BX500 480GB 3D NAND SATA 2.5" SSD

Complete portable device SoC history:

Spoiler
Apple A4 - Apple iPod touch (4th generation)
Apple A5 - Apple iPod touch (5th generation)
Apple A9 - Apple iPhone 6s Plus
HiSilicon Kirin 810 (T.S.M.C. 7nm) - Huawei P40 Lite / Huawei nova 7i
Mediatek MT2601 (T.S.M.C 28nm) - TicWatch E
Mediatek MT6580 (T.S.M.C 28nm) - TECNO Spark 2 (1GB RAM)
Mediatek MT6592M (T.S.M.C 28nm) - my|phone my32 (orange)
Mediatek MT6592M (T.S.M.C 28nm) - my|phone my32 (yellow)
Mediatek MT6735 (T.S.M.C 28nm) - HMD Nokia 3 Dual SIM
Mediatek MT6737 (T.S.M.C 28nm) - Cherry Mobile Flare S6
Mediatek MT6739 (T.S.M.C 28nm) - my|phone myX8 (blue)
Mediatek MT6739 (T.S.M.C 28nm) - my|phone myX8 (gold)
Mediatek MT6750 (T.S.M.C 28nm) - honor 6C Pro / honor V9 Play
Mediatek MT6765 (T.S.M.C 12nm) - TECNO Pouvoir 3 Plus
Mediatek MT6797D (T.S.M.C 20nm) - my|phone Brown Tab 1
Qualcomm MSM8926 (T.S.M.C. 28nm) - Microsoft Lumia 640 LTE
Qualcomm MSM8974AA (T.S.M.C. 28nm) - Blackberry Passport
Qualcomm SDM710 (Samsung 10nm) - Oppo Realme 3 Pro

 


I am sticking with my sub-1080p GTX 970, and my Vega 56 for 1080p, until the generation after this one, unless AMD offers something very competitive in terms of price.


16 hours ago, realshmalex said:

What I don't get is why the 3080 is 10GB...??

I'm waiting for a 3080 (Ti) 12~16GB version

Because why sell the same card once when you can sell it twice, to the same person, within the span of 1-2 years...

 

 

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


16 hours ago, realshmalex said:

What I don't get is why the 3080 is 10GB...??

I'm waiting for a 3080 (Ti) 12~16GB version

Unless you have a specific reason to need 12+ GB of VRAM, 10 GB will be plenty for 4K gaming. Benchmarks will come out soon, but I would guess that by the time 10 GB isn't enough, we'd have already dropped the resolution to 1440p anyway.

 

The other thing is that when using DLSS, you only render at 1080p, 1440p, or some other lower resolution before AI-upscaling to 4K. That may mean you only need roughly the VRAM required to render at 1080p rather than at 4K, making higher VRAM totals less important.
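One way to put rough numbers on that: the frame-sized buffers do shrink with the internal render resolution, though note that textures (usually the bigger VRAM consumer) are not reduced by DLSS. A simplified sketch assuming just one color and one depth target at 4 bytes per pixel each:

```python
def framebuffer_mb(width, height, bytes_per_px=8):
    """Rough per-frame buffer cost: 4 B color + 4 B depth per pixel."""
    return width * height * bytes_per_px / 2**20

for name, (w, h) in [("1080p", (1920, 1080)),
                     ("1440p", (2560, 1440)),
                     ("4K",    (3840, 2160))]:
    print(f"{name}: ~{framebuffer_mb(w, h):5.1f} MB per frame")
```

Real engines keep many more render targets (G-buffer layers, history buffers for DLSS itself), so treat these as floors rather than totals; the point is only that the frame-dependent part of VRAM scales with render area, roughly 4x between 1080p and 4K.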


Everyone is trying to predict what various things will need, but no one is sure. The impression I am getting is that Nvidia doesn't really know either, so they made a bunch of guesses. Some will turn out to be right and others will turn out to be wrong.


On 9/12/2020 at 3:59 AM, Ankerson said:

 

 

Saw more than a few tests running Doom Eternal on 2080 Tis, and yeah, it was about the same as or higher than what Nvidia showed for the 3080.

 

And that was with the settings maxed out at 4K.

 

So, like I have been saying all along, it's really going to be the 3080 that competes with the 2080 Ti, not the 3070 like people seem to think.

 

That is actually more believable too.

 

But we will all see soon enough.

 

But we do know RT and DLSS cut down the frame rates of the 2080 Ti quite a bit.... So I would hope the 3080 would beat it with RT and DLSS on... I mean, with all of the improvements they said they made with the Gen 2 RT...

 

I do think the 3090 is going to be a Monster though..

Can verify this: at 4K Ultra Nightmare I am getting similar performance with my 2080 Ti (albeit heavily OC'd).

 

This is also why I hate people using Doom so much for specific performance metrics. I am fine with it as one data point alongside other games, but as a sole data point it's almost useless, since the game is so well optimized and basically always an outlier (with FPS way on the higher side compared to standard AAA games).

 

Those bastards just made that game look so good with so little frame penalty. Good on 'em, but it's such a bad representation of performance, IMO.

El Zoido:  9900k + RTX 4090 / 32 gb 3600mHz RAM / z390 Aorus Master 

 

The Box:  3900x + RTX 3080 /  32 gb 3000mHz RAM / B550 MSI mortar 


8 minutes ago, Zberg said:

Can verify this: at 4K Ultra Nightmare I am getting similar performance with my 2080 Ti (albeit heavily OC'd).

 

This is also why I hate people using Doom so much for specific performance metrics. I am fine with it as one data point alongside other games, but as a sole data point it's almost useless, since the game is so well optimized and basically always an outlier (with FPS way on the higher side compared to standard AAA games).

 

Those bastards just made that game look so good with so little frame penalty. Good on 'em, but it's such a bad representation of performance, IMO.

 

Yeah, they just chose the wrong game.... LOL 🤣

 

There are people out here with 2080 Tis who have Doom Eternal.... I don't have that game personally, or I would run it and test it too.

 

I do have SOTTR, Metro Exodus, Far Cry 5 and Far Cry New Dawn....

 

The really funny thing is I had to reset my settings in the games that were tested because I play at higher settings than they tested at... LOL 🤣

 

 


21 hours ago, realshmalex said:

What I don't get is why the 3080 is 10GB...??

I'm waiting for a 3080 (Ti) 12~16GB version

If I were looking at that specific card, 10 GB wouldn't deter me.

 

I was messing with settings in RDR2 the other day: 4K, everything maxed, triple buffering. VRAM usage still doesn't reach 8 GB (it read 6,xxx MB, close to 7 GB). 10 is certainly enough; at least for the moment, 10 GB should be just fine.


6 hours ago, Nena Trinity said:

I think the 2080 Ti is a good example of how that can fail... :I

I have not said it's the best idea in the world either, LOL... XD

Personally I'll wait for reviews, and I'll even wait for the MSI Gaming X TRIO RTX 3080... maybe even a 3080 Super, the way it's going. For now the 1080 Ti still performs very well for what I do.

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


So basically, if I want to play games until the 4000 series comes out, at 4K, on ultra, with ray tracing and DLSS on, on a 144 Hz panel, I should get an RTX 3090 over an RTX 3080? Given that an 8K panel is out of the question for MOST people, are there any other use cases for a gamer? An RTX 3080 might allow me to hit the high 60s, while an RTX 3090 could get me closer to 144 Hz?


31 minutes ago, Yoshi Moshi said:

So basically, if I want to play games until the 4000 series comes out, at 4K, on ultra, with ray tracing and DLSS on, on a 144 Hz panel, I should get an RTX 3090 over an RTX 3080? Given that an 8K panel is out of the question for MOST people, are there any other use cases for a gamer? An RTX 3080 might allow me to hit the high 60s, while an RTX 3090 could get me closer to 144 Hz?

 

If that's what you are planning, then yeah, I believe the 3090 is your ticket....

 

If I personally get any of the 3000 series, it would be the 3090, the EVGA FTW3 Ultra.... But I have a 2080 Ti now, so..... And I don't game at 4K..


1 hour ago, Ankerson said:

 

If that's what you are planning, then yeah, I believe the 3090 is your ticket....

 

If I personally get any of the 3000 series, it would be the 3090, the EVGA FTW3 Ultra.... But I have a 2080 Ti now, so..... And I don't game at 4K..

Why would you get a 3090 without gaming at 4K?


Just now, Yoshi Moshi said:

Why would you get a 3090 without gaming at 4K?

 

Because I play everything with maxed-out settings.

 

I didn't say I was going to get one; I said if I got one, that would be it.


1 hour ago, Yoshi Moshi said:

So basically, if I want to play games until the 4000 series comes out, at 4K, on ultra, with ray tracing and DLSS on, on a 144 Hz panel, I should get an RTX 3090 over an RTX 3080? Given that an 8K panel is out of the question for MOST people, are there any other use cases for a gamer? An RTX 3080 might allow me to hit the high 60s, while an RTX 3090 could get me closer to 144 Hz?

I really can't stress enough the need to wait for reviews and see how good these cards actually are. Beyond that, we're kind of extrapolating from what Nvidia has thrown us.

 

But as of right now, based on what you want, I'd guess that if you really want 144 Hz ultra 4K, I can't imagine any card but the 3090 doing it. The 2080 Ti certainly can't, and it's honestly a big ask. I would be surprised if anything but the 3090 can do high-refresh 4K at ultra, which is an extreme luxury.

El Zoido:  9900K + RTX 4090 / 32 GB 3600 MHz RAM / Z390 Aorus Master 

 

The Box:  3900X + RTX 3080 / 32 GB 3000 MHz RAM / B550 MSI Mortar 


This thread is pointless. Instead of trying to estimate whether a 3090 is overkill based on some marketing video Nvidia felt like releasing, wait for benchmarks. If the 3090 benches well above 144 FPS at 4K in AAA titles and the 3080 also reaches 144, then you can show that to your friend and tell him there's no point in getting it, but as of now this is all just guesswork.

You also shouldn't tell your friend what to do with his money, unless he's broke and blowing his savings on the card, but then he shouldn't be gaming at 4K either. A lot of the buyers who go for the most expensive card don't actually know much about hardware. Back in the day, every single Titan owner I knew bought Titans that cost far more than the gaming cards but delivered the same gaming performance, just because they thought it sounded cool.

Nvidia knows this.


The 16th can't come soon enough

Before you reply to my post, REFRESH. 99.99% chance I edited my post. 

 

My System: i7-13700KF // Corsair iCUE H150i Elite Capellix // MSI MPG Z690 Edge Wifi // 32GB DDR5 G. SKILL RIPJAWS S5 6000 CL32 // Nvidia RTX 4070 Super FE // Corsair 5000D Airflow // Corsair SP120 RGB Pro x7 // Seasonic Focus Plus Gold 850w //1TB ADATA XPG SX8200 Pro/1TB Teamgroup MP33/2TB Seagate 7200RPM Hard Drive // Displays: LG Ultragear 32GP83B x2 // Royal Kludge RK100 // Logitech G Pro X Superlight // Sennheiser DROP PC38x


So I have come to terms with the fact that it's time to upgrade to a 4K panel. The new Xbox and PlayStation are 4K machines, and some movies are now 4K. The time to upgrade to 4K was always going to come eventually, and I think it's now.

 

Looking at the confirmed-legit Doom Eternal benchmark, both the 3080 and the 2080 Ti can hit more than 60 FPS at 4K. If the alleged Tomb Raider and Control leaks from the Chinese leaker are true, the 3080 can hit well over 60 FPS on ultra with ray tracing on. So would it make sense to get a 4K 144 Hz panel regardless of whether you're getting a 3080 or a 3090, since you'll most likely exceed 60 FPS with either? The Chinese leaks for Control and Tomb Raider came nowhere close to 144 FPS on the 3080, but were still above 60 FPS. Even with a 20 percent FPS increase in those games on a 3090, you would not max out a 144 Hz 4K panel. This is all assuming the leaks are true.
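A quick back-of-the-envelope check of that last point. The FPS numbers below are placeholders I picked to roughly match the ballpark of the alleged leaks, not real benchmark data, and the 20 percent uplift is just the rumored figure:

```python
# Back-of-the-envelope check: can a ~20% faster 3090 saturate a 144 Hz 4K panel?
# FPS figures are placeholders based on unconfirmed leaks, not benchmarks.
leaked_3080_fps = {
    "Control (RT on)": 68,            # assumed leak-ballpark figure
    "Shadow of the Tomb Raider": 80,  # assumed leak-ballpark figure
}

TARGET_HZ = 144
UPLIFT = 1.20  # assumed 3090-over-3080 gain

for game, fps_3080 in leaked_3080_fps.items():
    fps_3090 = fps_3080 * UPLIFT
    verdict = "saturates" if fps_3090 >= TARGET_HZ else "falls short of"
    print(f"{game}: 3080 ~{fps_3080} FPS, 3090 ~{fps_3090:.0f} FPS "
          f"-> {verdict} {TARGET_HZ} Hz")
```

Even from an 80 FPS baseline, a 20 percent uplift only gets you to 96 FPS, so nothing in that range touches 144 Hz.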

 

I sold my 2080 Ti before the announcement and am back on my old GTX 745, so I'll be getting at least one of these cards regardless of the reviews; the question is just which.

 

If the 3080 can get me 60 FPS or higher at 4K on ultra, is paying more than double the price of a 3080 for a 3090's 20 percent (or more) higher FPS worth it? I think that's a personal choice only the buyer can make. I personally don't mind 60 FPS; anything more is a luxury. I gamed on console for a long time at well under 60 FPS, so I'm used to less. So it might not be worth it to me.
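One way to frame that "worth it" question is cost per frame. This is just a sketch using the Founders Edition MSRPs, an assumed 70 FPS baseline, and the rumored 20 percent uplift; swap in real benchmark numbers once reviews land:

```python
# Cost-per-frame sketch. Prices are FE MSRPs; the FPS numbers are assumptions.
MSRP_3080 = 699    # USD
MSRP_3090 = 1499   # USD
fps_3080 = 70.0             # assumed 4K ultra baseline, purely illustrative
fps_3090 = fps_3080 * 1.20  # assumed ~20% uplift

cpf_3080 = MSRP_3080 / fps_3080
cpf_3090 = MSRP_3090 / fps_3090
marginal = (MSRP_3090 - MSRP_3080) / (fps_3090 - fps_3080)

print(f"3080: ${cpf_3080:.2f}/frame, 3090: ${cpf_3090:.2f}/frame")
print(f"Each extra frame on the 3090 costs about ${marginal:.0f}")
```

At these assumed numbers, each of the 3090's extra frames costs several times what the 3080's frames do, which is roughly why the value argument favors the 3080 unless you need the last bit of headroom (or the extra VRAM).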

 

If Nvidia had kept SLI on the 3080 (I know they didn't), I could get two 3080s and run them in SLI for the cost of a 3090, haha. They removed it for a reason, I guess.

