
NVIDIA GeForce 3070/3080/3080 Ti (Ampere): RTX Has No Perf Hit & x80 Ti Card 50% Faster in 4K! (Update 6 ~ Specs / Overview / Details)


Posted by BiG StroOnZ · Original Poster
Quote

[Image: NVIDIA Ampere teaser graphic]

 

NVIDIA's next-gen Ampere: huge ray tracing performance upgrades, more VRAM, less power, and also on 7nm. The news comes from the purported fact that NVIDIA has been discussing Ampere and the next-gen GeForce RTX 3000 series cards with AIB partners; there are only a few tidbits here, but they are juicy ones that, if true, would be great to see.

 

NVIDIA's new GeForce RTX 3000 series will feature "massive" performance improvements for ray tracing, with Ampere's iteration of ray tracing being faster and more power efficient than Turing's. Another area where Ampere will receive some big upgrades is rasterization, which, blended with the advancements in ray tracing on Ampere, should give us enough power to render next-gen graphics and next-gen worlds in games.

The new Ampere-based GeForce RTX 3000 series cards will reportedly offer more VRAM on all cards, so we could see:

 

  • NVIDIA GeForce RTX 3070 - 12GB
  • NVIDIA GeForce RTX 3080 - 12GB
  • NVIDIA GeForce RTX 3080 Ti - 16GB

When it comes to clock speeds, we should also expect Ampere-based cards to have 100-200MHz GPU clock speed uplifts versus Turing, all while being more power efficient thanks to the new Ampere GPU architecture and the fact that it's on 7nm. NVIDIA will be using the 7nm EUV process node for Ampere, something that will deliver NVIDIA some stellar power savings -- but this is where it gets interesting. NVIDIA has had really great power efficiency since Pascal, and went right into Turing with improved power efficiency, and that's on the larger 14nm and 12nm nodes. 7nm gives NVIDIA much more room to play in, joined with the Ampere GPU architecture. Rumor has it the new Ampere GPUs can run at under 1.0V, which is absolutely huge and could make for the most power-efficient GeForce cards yet. If NVIDIA pulled out all the stops in pursuit of driving higher voltages on their GPUs, we could see 400-500MHz GPU clock improvements on custom cards.
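To put the sub-1.0V rumor in perspective: dynamic power in CMOS logic scales roughly with the square of voltage times frequency (P ≈ C·V²·f). A minimal sketch of that relation, where the voltage and clock figures are illustrative assumptions rather than leaked numbers:

```python
# Rough CMOS dynamic-power scaling: P ~ C * V^2 * f.
# Illustrative figures only -- a Turing-class card vs. a hypothetical sub-1.0V Ampere part.

def relative_dynamic_power(v_new, f_new, v_ref, f_ref):
    """Dynamic power of a new operating point relative to a reference,
    assuming the same effective switched capacitance C."""
    return (v_new / v_ref) ** 2 * (f_new / f_ref)

# Assumed reference: a Turing card at ~1.05 V and 1.80 GHz boost.
# Assumed Ampere point: 0.95 V at 2.00 GHz (both are placeholders, not leaked specs).
ratio = relative_dynamic_power(v_new=0.95, f_new=2.00, v_ref=1.05, f_ref=1.80)
print(f"Relative dynamic power: {ratio:.2f}x")  # ~0.91x -- more clock for less power
```

That quadratic voltage term is why a sub-1.0V operating point would be such a big deal for efficiency.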

Finally, NVIDIA is reportedly set to offer its next-gen GeForce RTX 3000 series at cheaper prices than the current-gen GeForce RTX 2000 series.

 

Source: https://www.tweaktown.com/news/68455/nvidia-geforce-rtx-3080-ti-more-vram-way-faster-cheaper/index.html

 

It would be fitting to remain hesitant about believing all or any of the speculative details within this leak/rumor. Obviously, until we get closer to a time period where we would actually expect a new NVIDIA GPU architecture to release, many of the leaks and rumors leading up to that point may be somewhat inaccurate, containing half-truths or even complete disinformation. Nonetheless, not much of the above is too far off from past rumors, and Q3/Q4 of 2020 seems like an appropriate release window for Ampere, especially considering Intel's upcoming entry into the dGPU space. From an optimistic viewpoint, it seems that NVIDIA has checked all the right boxes with Ampere. Therefore, I'm looking forward to seeing the first leaked benchmarks in the coming months.

 

Small update to this story (GeForce RTX 3000 Ampere arriving in data centers in March and for consumers in June 2020):

Quote

NVIDIA is slowly but steadily preparing the launch of their Ampere-generation GPUs; allegedly starting March 2020, the first models will be out for data centers. GeForce 3000-series cards for desktop PCs would then appear in June. This is reported by the Chinese website HKEPC, citing Chris Caso, an analyst with Raymond James.

 

This means that around Computex, in June 2020, the first consumer variants with Ampere chips would see the light of day. The launch would involve the high-end cards first, with slower models thereafter. Ampere is said to be manufactured at Samsung on 7nm, using its 7nm+ Extreme Ultraviolet (EUV) process. Compared to Turing, clock rates could be increased by 200 to 300 MHz. 7nm makes transistors smaller, which allows more transistors to be placed on a chip. It is expected that Ampere will offer significantly more ray tracing cores than Turing did.

 

Source 2: https://www.guru3d.com/news_story/geforce_rtx_3000_ampere_data_center_marchconsumers_june_2020.html

Source 3: https://www.techradar.com/news/nvidia-rtx-3080-graphics-card-could-be-powering-gaming-pcs-in-june-2020

 

Another compelling update to this story - conceivable specifications for the Ampere GPUs (from our German friends @ 3DCenter.org):

Quote

[Image: 3DCenter.org speculative Ampere specifications table (1 of 2)]

 

[Image: 3DCenter.org speculative Ampere specifications table (2 of 2)]

 

SE (Shader-Einheiten) translates to shader units (in this case, NVIDIA CUDA cores). It appears they are suggesting that both the Tesla and Titan cards will be based on the same silicon, and that both will have HBM2, which is very intriguing. They don't specify memory setups on the other (consumer) cards, but I would imagine sticking with GDDR6 is still best for those applications. Also, looking at the proposed CUDA core counts for, say, the RTX 3080 Ti, that chip would be around 30-50% faster than an RTX 2080 Ti / RTX Titan. As for the purported Ampere Titan (GA100), that would be a colossal 70-80% faster than an RTX 2080 Ti / RTX Titan (in best-case scenarios).
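For what it's worth, this is the kind of back-of-the-envelope scaling that produces a "30-50% faster" estimate from a spec table. The 4352-core figure is the known RTX 2080 Ti count; 5376 is the GA102 count rumored later in this thread; the clock figures and the IPC multiplier are assumptions:

```python
# Naive throughput scaling: perf ~ cores * clock * IPC.
# 4352 = RTX 2080 Ti CUDA cores (known); 5376 = rumored GA102 count.
# Clock figures and the IPC multiplier are illustrative assumptions.

old_cores, old_clock_ghz = 4352, 1.9   # 2080 Ti at an assumed real-world boost clock
new_cores, new_clock_ghz = 5376, 2.1   # rumored GA102 at an assumed sustained clock
ipc_gain = 1.10                        # the rumored "+10% IPC over Turing"

speedup = (new_cores / old_cores) * (new_clock_ghz / old_clock_ghz) * ipc_gain
print(f"Estimated speedup: {speedup:.2f}x")  # ~1.50x; real games scale less than linearly
```

Treat it as an upper bound: actual game performance rarely scales one-to-one with core counts and clocks.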

 

Source 4

Source 5 

 

I was going to make a separate thread for this news story, but I think it's fitting to add it here. Basically, since the thread is already established, constructive conversation related to both the new topic and Ampere can continue (without much obstruction to the OP). This news story isn't directly related to Ampere, but if correct, it could definitely affect Ampere's release date, moving it closer to around Q4 2020, maybe even Q1 2021 (which would change a lot regarding past Ampere rumors).


But the news is that there's chatter suggesting NVIDIA is readying a GeForce RTX 2080 Ti SUPER:


Quote

[Image: screenshot of kopite7kimi's tweet]

 

[Image: second screenshot of kopite7kimi's tweet]

 

NVIDIA could launch a "GeForce RTX 2080 Ti Super" after all, if a tweet from kopite7kimi, an enthusiast with a fairly high hit-rate with NVIDIA rumors, is to be believed. The purported SKU could be faster than the RTX 2080 Ti, and yet be somehow differentiated from the TITAN RTX. For starters, NVIDIA could enable all 4,608 CUDA cores, 576 tensor cores, and 72 RT cores, along with 288 TMUs and 96 ROPs. Compared to the current RTX 2080 Ti, the Super could get faster 16 Gbps GDDR6 memory.

It's possible that NVIDIA won't change the 352-bit memory bus width or 11 GB memory amount, as those would be the only things stopping the card from cannibalizing the TITAN RTX, which has the chip's full 384-bit memory bus width, and 24 GB of memory. Interestingly, at 16 Gbps with a 352-bit memory bus width, the RTX 2080 Ti Super would have 704 GB/s of memory bandwidth, which is higher than the 672 GB/s of the TITAN RTX, with its 14 Gbps memory clock.
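The bandwidth figures above follow from the standard GDDR6 arithmetic: bus width in bits, divided by 8 to get bytes, times the per-pin data rate in Gbps. A quick sketch that reproduces the article's numbers:

```python
# GDDR6 peak memory bandwidth: (bus width in bits / 8 bits-per-byte) * per-pin data rate.
def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbps(352, 16))  # rumored RTX 2080 Ti Super: 704.0 GB/s
print(bandwidth_gbps(384, 14))  # TITAN RTX:                 672.0 GB/s
print(bandwidth_gbps(352, 14))  # RTX 2080 Ti:               616.0 GB/s
```

Both quoted figures (704 GB/s and 672 GB/s) check out exactly.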

 

 

Thought this was a decent update regarding this topic, worthy of a bump for this thread (but not worthy of an entirely new one):

 

Quote

Today, according to the latest report from the Taipei Times, NVIDIA's next generation of graphics cards based on the "Ampere" architecture is rumored to have as much as a 50% performance uplift compared to the previous generation of Turing GPUs, while having half the power consumption. NVIDIA is expected to launch its new Ampere-based GPUs in the second half of this year.

 

It is not exactly clear how they arrived at that math, but we'll happily take it as an expectation. The cards would be built on 7-nanometer technology, which would lead to a 50 percent increase in graphics performance while halving power consumption, they said. Perhaps they figure that halving the fabrication process can double the transistor count, and that this is where the 50% performance figure comes from. However, performance should increase even further, because Ampere will bring a new architecture as well. Combining a new manufacturing node and a new microarchitecture, Ampere would cut power consumption in half, making for a very efficient GPU solution. We still don't know whether the performance will increase mostly in ray tracing applications, or whether NVIDIA will put the focus on general graphics performance.
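Taken at face value, those two claims compound into an eye-popping efficiency figure, which is itself a reason for skepticism:

```python
# "50% faster at half the power", taken literally:
perf_ratio = 1.5     # rumored performance vs. Turing
power_ratio = 0.5    # rumored power draw vs. Turing

print(f"Implied perf/W gain: {perf_ratio / power_ratio:.1f}x")  # 3.0x
```

A 3x generational perf-per-watt jump would be well beyond what a node shrink alone typically delivers.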

 

Source 9: http://www.taipeitimes.com/News/biz/archives/2020/01/02/2003728557

Source 10: https://www.techpowerup.com/262592/nvidias-next-generation-ampere-gpus-to-be-50-faster-than-turing-at-half-the-power

Source 11: https://www.guru3d.com/news-story/next-generation-nvidia-ampere-reportedly-to-offer-50-more-perf-at-half-the-power.html

 

As far as my opinion on the matter: other outlets have claimed similar or even higher performance increases, so it isn't out of the realm of possibility. However, many conflicting opinions appeared at the time the original performance claims surfaced (mainly skepticism, along the lines of "there's no way anyone can know these suggested performance numbers this early" -- that was back in early November 2019).

 

 

Fifth update on Ampere:

 

Quote

 

Spoiler

[Image: leaked Ampere specifications slide (1 of 2)]

 

 

 

Spoiler

[Image: leaked Ampere specifications slide (2 of 2)]

 

 

Spoiler

[Image: alleged Ampere die diagram (GA103 or GA104)]

 

Spoiler

[Image: alleged Ampere die diagram (GA103 or GA104)]

 

 

Alleged specifications of the GeForce RTX 3070 and RTX 3080 have surfaced. Of course, you'll need to keep a ton of disclaimers in mind, and take it all with huge grains of salt. But history has proven over and over again that there is often validity (at least to some degree) to be found in these leaks. So here we go:

 

For starters, the two dies which have appeared carry the codenames GA103 and GA104, standing for the RTX 3080 and RTX 3070 respectively. Perhaps the biggest surprise is the Streaming Multiprocessor (SM) count. The smaller GA104 die has as many as 48 SMs, resulting in 3072 CUDA cores, while the bigger, oddly named GA103 die has as many as 60 SMs, resulting in 3840 CUDA cores in total. These improvements in SM count should result in a notable performance increase across the board. Alongside the increase in SM count, there is also a new memory bus width. The smaller GA104 die that should end up in the RTX 3070 uses a 256-bit memory bus allowing for 8/16 GB of GDDR6 memory, while its bigger brother, the GA103, has a 320-bit bus that allows the card to be configured with either 10 or 20 GB of GDDR6 memory.
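Those numbers are internally consistent with Turing-style SM and memory-channel arithmetic: 64 FP32 CUDA cores per SM, and one 32-bit GDDR6 channel per memory chip at 1 GB or 2 GB per chip. A sketch of that math, assuming those ratios carry over to Ampere:

```python
# SM-to-core and bus-to-VRAM arithmetic behind the rumored GA103/GA104 specs.
# Assumes Turing's 64 FP32 CUDA cores per SM and 32-bit GDDR6 channels carry over.

CORES_PER_SM = 64
CHANNEL_BITS = 32  # one GDDR6 chip per 32-bit channel

def cuda_cores(sm_count: int) -> int:
    return sm_count * CORES_PER_SM

def vram_options_gb(bus_width_bits: int) -> tuple[int, int]:
    chips = bus_width_bits // CHANNEL_BITS
    return chips * 1, chips * 2  # 1 GB (8 Gb) or 2 GB (16 Gb) per chip

print(cuda_cores(48), vram_options_gb(256))  # GA104: 3072 cores, (8, 16) GB
print(cuda_cores(60), vram_options_gb(320))  # GA103: 3840 cores, (10, 20) GB
```

The fact that the leak's core counts and memory options all fall out of the same simple ratios cuts both ways: it could mean the leak is genuine, or just that it was constructed to be plausible.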

 

The original source also shared die diagrams for GA103 and GA104; they look professional, but they are not as detailed as the Turing diagrams, hence we cast strong doubt on their credibility.

 

Rumors are that at GDC in March we'll see the first announcements of the Ampere architecture (if it'll even be called Ampere).

 

Source 12: https://videocardz.com/newz/rumor-first-nvidia-ampere-geforce-rtx-3080-and-rtx-3070-specs-surface

Source 13: https://www.guru3d.com/news-story/rumor-nvidia-ampere-geforce-rtx-3070-and-rtx-3080-specs-surface.html 

Source 14: https://www.techpowerup.com/263128/rumor-nvidias-next-generation-geforce-rtx-3080-and-rtx-3070-ampere-graphics-cards-detailed

 

From the "Moore's Law Is Dead" YT Channel: 

 

Spoiler

[Embedded video: Moore's Law Is Dead]

Quote

 

[Image: TweakTown article header, "NVIDIA Ampere rumor: next-gen GeForce has no perf hit with RTX on"]

 

[Image: TweakTown article screenshot, "next-gen GeForce has no perf hit with RTX on"]

 

In a new video from YouTube channel Moore's Law is Dead, according to "exclusive insider info" secured by Tom, NVIDIA's new Ampere cards are not just a die shrink of Turing with more RT Cores. It is not just the "Pascal version of Turing"; instead, Ampere is a "multipurpose architecture".

 

One of the more exciting parts of the new Ampere rumors is that ray tracing performance is significantly better than Turing -- so much so that it reportedly offers 4x better performance per tier. This means a GeForce RTX 3060 will offer the same ray tracing performance as the flagship GeForce RTX 2080 Ti -- if not better.

 

The rumors do clarify that Turing will not age well when Ampere is here, with Tom reporting "Turing doing RT will be like Kepler doing DX12". 

Quote

 

[Image: TweakTown article header, "GeForce RTX 3080 Ti is up to 50% faster than 2080 in 4K gaming"]

 

[Image: TweakTown article screenshot, "GeForce RTX 3080 Ti is up to 50% faster than 2080 in 4K gaming" (continued)]

 

These new rumored specs on GA102 have it packing 5376 CUDA cores on the Ampere architecture, 10% more IPC than Turing, and the 7nm node lets GPU clocks scale much higher, to 2.2GHz and beyond. The lower-end Ampere GPUs will reach the dizzying heights of 2.5GHz.

 

But the memory specs on the GeForce RTX 3080 Ti have me enthused, with NVIDIA using 18Gbps GDDR6 memory which absolutely destroys with 863GB/sec of memory bandwidth. This is a 40% increase over the GeForce RTX 2080 Ti, and will see the GeForce RTX 3080 Ti being 40% faster in 4K gaming in unoptimized games, and up to 50% faster in optimized games. Wow. Just, wow.
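Running the same formula as earlier against these claims: 18 Gbps on what would have to be a 384-bit bus yields 864 GB/s (the article quotes 863), versus the RTX 2080 Ti's 616 GB/s, which is where the ~40% figure comes from. The 384-bit width is an inference from the quoted bandwidth, not something the article states:

```python
# Same GDDR6 bandwidth formula as before; the 384-bit bus is inferred, not confirmed.
def bandwidth_gbps(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

rtx_3080_ti = bandwidth_gbps(384, 18)  # 864.0 GB/s (the article rounds to 863)
rtx_2080_ti = bandwidth_gbps(352, 14)  # 616.0 GB/s
print(f"Bandwidth uplift: {rtx_3080_ti / rtx_2080_ti - 1:.0%}")  # ~40%
```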

 

NVIDIA will reportedly be moving over to the new PCIe 4.0 standard, while the cooler will look "similar" to the RTX 20-series Founders Edition cards but it will have an upgraded triple-fan cooler. The design has been "simplified" with "less screws on the back of the card".

 

Source 15: https://www.tweaktown.com/news/72400/nvidia-ampere-rumor-next-gen-geforce-has-no-perf-hit-with-rtx-on/index.html

Source 16: https://www.tweaktown.com/news/72449/geforce-rtx-3080-ti-is-up-to-50-faster-than-2080-in-4k-gaming/index.html

 

Quote

[Image: translated excerpt from 3DCenter.org's news post]

 

Source 17: https://www.3dcenter.org/news/hardware-und-nachrichten-links-des-11-mai-2020


Quote

The news is coming from the purported fact that NVIDIA has been discussing Ampere and the next-gen GeForce RTX 3000 series cards with AIB partners, and Wccftech has posted up these rumors. We don't know any concrete details on the Ampere GPU itself, which is something NVIDIA will deep dive into once it's announced.

So, this is ENTIRELY rumor; the sources are "itself" and WCCFTech, and they flat out admit that they know truly nothing about the GPUs...

 



"Put as much effort into your question as you'd expect someone to give in an answer"- @Princess Luna

Make sure to Quote posts or tag the person with @[username] so they know you responded to them!

Purple Build Post ---  Blue Build Post --- Blue Build Post 2018 --- RGB Build Post 2019 --- Project ITNOS --- P600S VS Define R6/S2

CPU i7-4790k    Motherboard Gigabyte Z97N-WIFI    RAM G.Skill Sniper DDR3 1866mhz    GPU EVGA GTX1080Ti FTW3    Case Corsair 380T   

Storage 1x Samsung EVO 250GB, WD Black 3TB, WD Black 5TB    PSU Corsair CX550M    Cooling Cryorig H7 with NF-A12x25

Link to post
Share on other sites

let's see if the flagship will break the $2000 price level for the reference model at launch



It'll be interesting to see what the performance improvements are like.

I don't think we'll really see drops in power consumption. They're already decent enough. I imagine they'll just increase the performance to lay waste to AMD.

 

9 minutes ago, Leviathan- said:

And the cards will be cheaper? Hah.

More yield from switching to 7nm? They've got to compete with AMD.

8 minutes ago, emosun said:

people are so excited about raytracing and I still don't get why dx10 is better than dx9

I'd just be excited to get solid 1440p 144hz performance from a mid range card...

Ray tracing will be cool. In like 4 years.

2 minutes ago, HarryNyquist said:

If they are I'm gonna be mad AF cuz I literally just bought a 2080 super to replace my dead 1080Ti

Then you should get out of tech.



16GB AMD VRAM competitor card when?


Just now, rcmaehl said:

16GB AMD VRAM competitor card when?

Radeon 7?

 

Anyways, this "leak" is as useless as AdoredTV's delusions about any AMD release.


24 minutes ago, TVwazhere said:

So, this is ENTIRELY rumor, the sources are "itself" and WCCFTech, and they flat out admit that they know truly nothing about the GPU's...

 


Also, they post this as if this thing is coming out any day now lol. This is just some specs and speculation, which I'm not saying is untrue, but I'm not gonna believe it until the time comes. If they go by the 1080-to-2080 release cadence, it will take almost 2 years from the time the 2080 came out. I wouldn't hold my breath, ya know. But it's guaranteed by 2021, maybe earlier; who am I to say.


14 minutes ago, HarryNyquist said:

If they are I'm gonna be mad AF cuz I literally just bought a 2080 super to replace my dead 1080Ti

I wouldn't be worried about it man. They're going to figure out some way to raise the price on it without a doubt unless AMD and Intel really land their next GPU. How did you kill that 1080TI? 

7 minutes ago, Princess Luna said:

Radeon 7?

Wow. I'm an idiot. I knew this and forgot


26 minutes ago, Jurrunio said:

let's see if the flagship will break the $2000 price level for the reference model at launch

I wouldn't go that far, but I wouldn't be surprised either. Look at the 1080 Ti price, then the 2080 Ti, which was literally 1500 bones or more at launch.



* Proceed to checkout my 2nd RTX2080ti *

* Sees the Ampere news *

Wait hold on ...

* Sees Q4 2020 release *

* finalize the payment for the 2080ti *

 

Though the news of overall cheaper cards gives me hope, I don't think Ampere will be groundbreaking.

15 minutes ago, Turtle Rig said:

I wouldn't go that far, but I wouldn't be surprised either.  Look at the 1080Ti price then the 2080Ti which was literally 1500 bones or more at launch.

Didn't the 8800 Ultra launch at $999 like 10 or 12 years ago? $500 more than that isn't much including inflation. I mean, it's a crap ton of money for a GPU, but I still want one lol.

2 minutes ago, corsairian said:

Didn't the 8800 Ultra launch at $999 like 10 or 12 years ago? $500 more than that isn't much including inflation. I mean, it's a crap ton of money for a GPU, but I still want one lol.

[Image: inflation calculator screenshot]
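For the curious, a rough version of the math in that screenshot, assuming roughly 25% cumulative US CPI inflation between 2007 and 2019 (an approximation, not an official figure):

```python
# Rough inflation adjustment for the quoted 8800 Ultra launch price.
# The ~25% cumulative CPI figure for 2007-2019 is an approximation.

price_2007 = 999            # the launch price quoted above
cumulative_inflation = 1.25 # assumed 2007 -> 2019 CPI multiplier
price_2019 = price_2007 * cumulative_inflation
print(f"~${price_2019:.0f} in 2019 dollars")  # ~$1249 -- 2080 Ti territory
```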

2 minutes ago, corsairian said:

Didn't the 8800 Ultra launch at $999 like 10 or 12 years ago? $500 more than that isn't much including inflation. I mean, it's a crap ton of money for a GPU, but I still want one lol.

Yes, with Intel and nVidia you're guaranteed to be paying a whole lot more. That is the industry. I think nVidia doesn't care as much as Intel did when they dropped $1k off their upcoming 18-core 10980XE. nVidia doesn't need to do anything drastic like that, as it already has a fan base or consumer base. Some will swear by Intel and nVidia, and some swear by AMD for both CPU and GPU and the best bang for your buck. Sometimes people want the best bang and don't care about the extra bucks they pay.


9 minutes ago, OlympicAssEater said:

Bitcoin miners will destroy the msrp really quick. I hate those mf bitcoin miners ruining msrp and supply.

Is that still a thing? I thought that was over now.


Depending on how much "cheaper", this could be the upgrade I've been waiting for. My 1080 has been great from day 1, but I'll see how it endures by the time the 3000 series is released.

 



I've been planning to buy a 3000-series NVIDIA GPU anyway once they launch, regardless of or despite any rumours. I'm skipping the 2000-series entirely, upgrading from my GTX 1080. Can't say any of these attempts to hype things up do anything for me -- I know a 3070 or 3080 will be faster than my 1080 and will have some ray-tracing stuff, so I can give that at least a try.

 

I just don't give a fuck about how many cores this or megabytes that it has. The only thing I wish to know is whether AV1 will be among the supported codecs in NVENC or NVDEC.



