
NVIDIA GeForce 3070/3080/3080 Ti (Ampere): RTX Has No Perf Hit & x80 Ti Card 50% Faster in 4K! (Update 6 ~ Specs / Overview / Details)

6 minutes ago, Reytime said:

20GB VRAM would help with 4K gaming.

 

But really, this way we can keep pointing and laughing at console gamers, even after the PS5 and Xbox Series X?

Pointing and laughing that they spent $400 years ago and can still play new games just fine on the couch with other people in the house, that easily? FYI, non-console gamer here, but seriously.

5 minutes ago, Bombastinator said:

I will choose to keep pointing and laughing at 4k

pointing and laughing at what?

 


1 minute ago, pas008 said:

Pointing and laughing that they spent $400 years ago and can still play new games just fine on the couch with other people in the house, that easily? FYI, non-console gamer here, but seriously.

 

I don't disagree with you. I play both. But with the new consoles coming out, you can't help but wonder how big the difference in performance and quality is going to be. Maybe something like 20GB of VRAM is an indication of that.

AMD Ryzen 9 5900X - Nvidia RTX 3090 FE - Corsair Vengeance Pro RGB 32GB DDR4 3200MHz - Samsung 980 Pro 250GB NVMe m.2 PCIE 4.0 - 970 Evo 1TB NVMe m.2 - T5 500GB External SSD - Asus ROG Strix B550-F Gaming (Wi-Fi 6) - Corsair H150i Pro RGB 360mm - 3 x 120mm Corsair AF120 Quiet Edition - 3 x 120mm Corsair ML120 - Corsair RM850X - Corsair Carbide 275R - Asus ROG PG279Q IPS 1440p 165hz G-Sync - Logitech G513 Linear - Logitech G502 Lightsync Wireless - Steelseries Arctic 7 Wireless


3 minutes ago, Reytime said:

I don't disagree with you. I play both. But with the new consoles coming out, you can't help but wonder how big the difference in performance and quality is going to be. Maybe something like 20GB of VRAM is an indication of that.

Or it's another reason for NVIDIA to keep prices up on a couple of model variants, even though Samsung was supposedly going to be cheaper than TSMC.


6 minutes ago, pas008 said:

Pointing and laughing that they spent $400 years ago and can still play new games just fine on the couch with other people in the house, that easily? FYI, non-console gamer here, but seriously.

pointing and laughing at what?

 

The thing about 4K is it's just overkill. IIRC, the point at which 4K becomes actually useful is something like a 160" screen.
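If anyone wants to poke at that claim, here's a rough back-of-the-envelope sketch (my own, not from anyone in this thread) using the common ~60 pixels-per-degree acuity rule of thumb; the screen size, viewing distance, and threshold are all assumptions you can swap out:

```python
import math

# Rough sketch: pixels per degree for a flat 16:9 screen viewed head-on.
# ~60 pixels per degree is a common rule-of-thumb acuity limit (an assumption here).
def pixels_per_degree(diag_in, horiz_res, distance_in, aspect=(16, 9)):
    aw, ah = aspect
    width_in = diag_in * aw / math.hypot(aw, ah)                        # screen width from the diagonal
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))   # horizontal FOV the screen covers
    return horiz_res / fov_deg

# Example: a 65" TV viewed from 10 feet (120 inches).
for name, res in [("1080p", 1920), ("4K", 3840)]:
    print(f'{name} on a 65" screen at 10 ft: ~{pixels_per_degree(65, res, 120):.0f} px/deg')
```

Under those assumptions, a 65" 1080p panel at 10 feet already lands above ~60 px/deg, which is why the extra density of 4K is hard to see until the screen gets much bigger or you sit much closer.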

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


20 minutes ago, Reytime said:

20GB VRAM would help with 4K gaming.

 

But really, this way we can keep pointing and laughing at console gamers, even after the PS5 and Xbox Series X?

Looking at https://www.tweaktown.com/tweakipedia/90/much-vram-need-1080p-1440p-4k-aa-enabled/index.html, increasing resolution doesn't dramatically increase the VRAM requirements.

 

The only thing that increasing screen resolution would affect is how big the render targets are. I don't have an exact number that a game may use, but looking at the data from https://www.guerrilla-games.com/read/killzone-shadow-fall-demo-postmortem, they claim about 800MB was used for render targets with the game running at 1080p. Even if we bumped this up to 4K, you could claim "but now the render target size would balloon to 3200MB!", but at the same time, you could make the rendering pipeline use fewer render targets to achieve the same result. Also, not all render targets are rendered at the output resolution.

 

Most of what's used in VRAM is texture data, as also seen in the linked slide deck. So at best, all the extra VRAM would be used for is higher-resolution textures.
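To make the render target scaling above concrete, here's a minimal sketch of the naive arithmetic (my own illustration, assuming render target memory scales purely with pixel count, which, as noted above, real pipelines don't strictly do):

```python
# Naive estimate: scale the ~800MB of 1080p render targets quoted from the
# Killzone: Shadow Fall postmortem by pixel count alone.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

BASELINE_MB = 800            # quoted figure at 1080p
base_px = 1920 * 1080

for name, (w, h) in RESOLUTIONS.items():
    estimate = BASELINE_MB * (w * h) / base_px
    print(f"{name}: ~{estimate:.0f} MB of render targets (naive pixel-count scaling)")
```

That reproduces the 3200MB figure for 4K and puts 1440p at roughly 1.4GB under the same assumption.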


1 hour ago, Mira Yurizaki said:

Looking at https://www.tweaktown.com/tweakipedia/90/much-vram-need-1080p-1440p-4k-aa-enabled/index.html, increasing resolution doesn't dramatically increase the VRAM requirements.

 

The only thing that increasing screen resolution would affect is how big the render targets are. I don't have an exact number that a game may use, but looking at the data from https://www.guerrilla-games.com/read/killzone-shadow-fall-demo-postmortem, they claim about 800MB was used for render targets with the game running at 1080p. Even if we bumped this up to 4K, you could claim "but now the render target size would balloon to 3200MB!", but at the same time, you could make the rendering pipeline use fewer render targets to achieve the same result. Also, not all render targets are rendered at the output resolution.

 

Most of what's used in VRAM is texture data, as also seen in the linked slide deck. So at best, all the extra VRAM would be used for is higher-resolution textures.

Would you say a high-res texture pack would require the same amount of VRAM at 1080p as it does at 4K?

AMD Ryzen 9 5900X - Nvidia RTX 3090 FE - Corsair Vengeance Pro RGB 32GB DDR4 3200MHz - Samsung 980 Pro 250GB NVMe m.2 PCIE 4.0 - 970 Evo 1TB NVMe m.2 - T5 500GB External SSD - Asus ROG Strix B550-F Gaming (Wi-Fi 6) - Corsair H150i Pro RGB 360mm - 3 x 120mm Corsair AF120 Quiet Edition - 3 x 120mm Corsair ML120 - Corsair RM850X - Corsair Carbide 275R - Asus ROG PG279Q IPS 1440p 165hz G-Sync - Logitech G513 Linear - Logitech G502 Lightsync Wireless - Steelseries Arctic 7 Wireless


16GB on a 3080 Ti would mean either a 512-bit bus or a 256-bit bus. 512-bit is unlikely due to the difficulty of routing the traces (see the R9 290 PCBs), especially with the higher frequency of GDDR6 and potentially PCIe 4.0. A 256-bit bus would mean it has less bandwidth than the current-gen flagship 2080 Ti. Also, the picture says 384-bit bus, and no memory manufacturer makes the ~10.67-gigabit dies a 16GB card on a 384-bit bus would need.
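For anyone who wants to check that arithmetic, here's a quick sketch of how bus width, per-chip density, and data rate pin down capacity and bandwidth (my own illustration; the configurations are hypothetical except the 2080 Ti row):

```python
# GDDR6 chips have a 32-bit interface, so bus width fixes the chip count,
# chip count times per-chip density fixes capacity, and
# bus width times data rate fixes bandwidth.
def gddr6_config(bus_bits, chip_gbit, rate_gbps):
    chips = bus_bits // 32
    capacity_gb = chips * chip_gbit / 8
    bandwidth_gbs = bus_bits * rate_gbps / 8
    return chips, capacity_gb, bandwidth_gbs

configs = [
    ("RTX 2080 Ti (352-bit, 8Gb chips, 14Gbps)", 352, 8, 14),
    ("256-bit, 16Gb chips, 14Gbps",              256, 16, 14),
    ("384-bit, 8Gb chips, 14Gbps",               384, 8, 14),
    ("512-bit, 8Gb chips, 14Gbps",               512, 8, 14),
]

for label, bus, density, rate in configs:
    chips, cap, bw = gddr6_config(bus, density, rate)
    print(f"{label}: {chips} chips, {cap:.0f} GB, {bw:.0f} GB/s")
```

A 256-bit card reaches 16GB easily but falls well short of the 2080 Ti's 616 GB/s, while a 384-bit bus with standard chip densities lands on 12GB (or 24GB) rather than 16GB.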


7 minutes ago, Firewrath9 said:

16GB on a 3080 Ti would mean either a 512-bit bus or a 256-bit bus. 512-bit is unlikely due to the difficulty of routing the traces (see the R9 290 PCBs), especially with the higher frequency of GDDR6 and potentially PCIe 4.0. A 256-bit bus would mean it has less bandwidth than the current-gen flagship 2080 Ti. Also, the picture says 384-bit bus, and no memory manufacturer makes the ~10.67-gigabit dies a 16GB card on a 384-bit bus would need.

I remember seeing 12GB as one claim at one point. Dunno how or if that fits at all. I also heard something about some kind of hybrid GDDR6/HBM thing at one point too. We shall see what we shall see.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


On 10/30/2019 at 8:36 PM, HarryNyquist said:

If they are I'm gonna be mad AF cuz I literally just bought a 2080 super to replace my dead 1080Ti

Why would that make you mad? GDC isn't until March, and who knows when any new cards would be available. I would just enjoy your new GPU; something new is always going to come out. I wouldn't expect anything this year, and if something new does come out, I'll be pleasantly surprised.


18 hours ago, Reytime said:

20GB VRAM would help with 4K gaming.

 

But really, this way we can keep pointing and laughing at console gamers, even after the PS5 and Xbox Series X?

I'd really be more inclined to point and laugh at the GPUs this generation. $400 on an RX 5700 XT or RTX 2060 Super just to get console parity is perverse, so I really hope the rumors of Ampere being a great architecture for gaming GPUs hold, and not the rumors about Ampere being only for Tesla and Quadro cards like Volta was. Because there is absolutely no chance I'm buying a GPU over the PS5 in 2020 if we just get crap Turing refreshes. I can't imagine we'll get decently priced AMD GPUs this year, since there will be huge demand for their wafers for Ryzen and the new consoles.


2 minutes ago, Mira Yurizaki said:

The only "perverse" thing I see about it is consoles use the hardware as a loss leader while PC hardware companies can't do that.

The huge jump in prices since Turing launched is the perverse part.


I have 2x RTX Titan; VRAM usage goes up to 13.5GB in Final Fantasy XV with SLI and everything maxed at 4K, and I have zero stutter. A friend has 2x 2080 Ti and gets stutters every 15-30 minutes, seemingly at random, once the VRAM is full.

In Wolfenstein it's the same: with a Titan it's super smooth, with a 2080 Ti it's not at 4K with everything maxed. Even at 8K it doesn't stutter with the Titan.

Now you can write what you want, but nothing changes the fact that 11GB is not enough.


soooo does this mean the price of the RTX cards will tank anytime soon ???

Someone told Luke and Linus at CES 2017 to "Unban the legend known as Jerakl" and that's about all I've got going for me. (It didn't work)

 


12 hours ago, Nuada said:

I have 2x RTX Titan; VRAM usage goes up to 13.5GB in Final Fantasy XV with SLI and everything maxed at 4K, and I have zero stutter. A friend has 2x 2080 Ti and gets stutters every 15-30 minutes, seemingly at random, once the VRAM is full.

In Wolfenstein it's the same: with a Titan it's super smooth, with a 2080 Ti it's not at 4K with everything maxed. Even at 8K it doesn't stutter with the Titan.

Now you can write what you want, but nothing changes the fact that 11GB is not enough.

Just out of curiosity, because it's been a while since I've played FFXV - has something changed with the SLI profiles for this game? I've been meaning to play it again, and I was wondering if the floating textures that require a reload from time to time are still a thing.

CPU: i7 6950X  |  Motherboard: Asus Rampage V ed. 10  |  RAM: 32 GB Corsair Dominator Platinum Special Edition 3200 MHz (CL14)  |  GPUs: 2x Asus GTX 1080ti SLI 

Storage: Samsung 960 EVO 1 TB M.2 NVME  |  PSU: In Win SIV 1065W 

Cooling: Custom LC 2 x 360mm EK Radiators | EK D5 Pump | EK 250 Reservoir | EK RVE10 Monoblock | EK GPU Blocks & Backplates | Alphacool Fittings & Connectors | Alphacool Glass Tubing

Case: In Win Tou 2.0  |  Display: Alienware AW3418DW  |  Sound: Woo Audio WA8 Eclipse + Focal Utopia Headphones


  • 3 months later...

Now we just need to see what Navi 21 dishes out. With these big cards it's all about what they can do with games written for consoles. Though this new RTX Tensor core stuff may be interesting for machine learning people.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


I swear Nvidia will screw over its users again if they don't give these cards enough VRAM. The 9xx series didn't have enough, just like the 400/500 series.

If this and Big Navi are half as good as rumored, I'm happy I've sat on my 580.

Good luck, Have fun, Build PC, and have a last gen console for use once a year. I should answer most of the time between 9 to 3 PST

NightHawk 3.0: R7 5700x @, B550A vision D, H105, 2x32gb Oloy 3600, Sapphire RX 6700XT  Nitro+, Corsair RM750X, 500 gb 850 evo, 2tb rocket and 5tb Toshiba x300, 2x 6TB WD Black W10 all in a 750D airflow.
GF PC: (nighthawk 2.0): R7 2700x, B450m vision D, 4x8gb Geli 2933, Strix GTX970, CX650M RGB, Obsidian 350D

Skunkworks: R5 3500U, 16gb, 500gb Adata XPG 6000 lite, Vega 8. HP probook G455R G6 Ubuntu 20. LTS

Condor (MC server): 6600K, z170m plus, 16gb corsair vengeance LPX, samsung 750 evo, EVGA BR 450.

Spirt  (NAS) ASUS Z9PR-D12, 2x E5 2620V2, 8x4gb, 24 3tb HDD. F80 800gb cache, trueNAS, 2x12disk raid Z3 stripped

PSU Tier List      Motherboard Tier List     SSD Tier List     How to get PC parts cheap    HP probook 445R G6 review

 

"Stupidity is like trying to find a limit of a constant. You are never truly smart in something, just less stupid."

Camera Gear: X-S10, 16-80 F4, 60D, 24-105 F4, 50mm F1.4, Helios44-m, 2 Cos-11D lavs


2 hours ago, Results45 said:

 

Largest Ampere leaks & analysis yet from Tom at Moore's Law is Dead:

 

As far as I can tell, more than half of it sounds very very plausible.

 

 

 

Posted this earlier today (from the video): 

 

Quote

 

[Attached images: slides claiming next-gen GeForce has no performance hit with RTX on]

 

In a new video from the YouTube channel Moore's Law is Dead, according to "exclusive insider info" secured by Tom, NVIDIA's new Ampere cards are not just a die shrink of Turing with more RT cores. Ampere is not just the "Pascal version of Turing"; instead, it is a "multipurpose architecture".

 

One of the more exciting parts of the new Ampere rumors is that ray tracing performance is significantly better than Turing -- so much so that it reportedly offers 4x better performance per tier. This means a GeForce RTX 3060 will offer the same ray tracing performance as the flagship GeForce RTX 2080 Ti -- if not better.

 

The rumors do clarify that Turing will not age well when Ampere is here, with Tom reporting "Turing doing RT will be like Kepler doing DX12". 

 

Source: https://www.tweaktown.com/news/72400/nvidia-ampere-rumor-next-gen-geforce-has-no-perf-hit-with-rtx-on/index.html

 

Quote

 

[Attached images: slides claiming the GeForce RTX 3080 Ti is up to 50% faster in 4K gaming]

 

These new rumored specs on GA102 have it packing 5376 CUDA cores on the Ampere architecture, 10% more IPC than Turing, and on the 7nm node that lets GPU clocks scale much higher to 2.2GHz and beyond. The lower-end Ampere GPUs will reach the dizzying heights of 2.5GHz.

 

But the memory specs on the GeForce RTX 3080 Ti have me enthused, with NVIDIA using 18Gbps GDDR6 memory which absolutely destroys with 863GB/sec of memory bandwidth. This is a 40% increase over the GeForce RTX 2080 Ti, and will see the GeForce RTX 3080 Ti being 40% faster in 4K gaming over unoptimized games, and up to 50% faster in 4K gaming in optimized games. Wow. Just, wow.

 

NVIDIA will reportedly be moving over to the new PCIe 4.0 standard, while the cooler will look "similar" to the RTX 20-series Founders Edition cards but it will have an upgraded triple-fan cooler. The design has been "simplified" with "less screws on the back of the card".

 

Source 2: https://www.tweaktown.com/news/72449/geforce-rtx-3080-ti-is-up-to-50-faster-than-2080-in-4k-gaming/index.html
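As a quick sanity check of the bandwidth math in that quote (my own arithmetic, not from the article; the 3080 Ti numbers are rumors, while the 2080 Ti figures are the shipping spec):

```python
# Memory bandwidth = bus width (bits) * data rate (Gbps) / 8.
def mem_bandwidth_gbs(bus_bits, rate_gbps):
    return bus_bits * rate_gbps / 8

rumored_3080ti = mem_bandwidth_gbs(384, 18)   # rumored 18Gbps GDDR6 on a 384-bit bus
rtx_2080ti     = mem_bandwidth_gbs(352, 14)   # 14Gbps GDDR6 on the 2080 Ti's 352-bit bus

print(f"Rumored 3080 Ti: {rumored_3080ti:.0f} GB/s")    # ~864 GB/s
print(f"RTX 2080 Ti:     {rtx_2080ti:.0f} GB/s")         # 616 GB/s
print(f"Uplift: {rumored_3080ti / rtx_2080ti - 1:.0%}")  # ~40%
```

That works out to roughly 864 GB/s (the article rounds it to 863) and about a 40% uplift over the 2080 Ti, consistent with the quoted claim.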


I know all this means it’s not exactly a fantastic time to build a PC but....it’s not like I have much of a choice 

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


Dang this thread was sure brought back from the dead, it's like a year old. 

 

 


All I care about is the release date. I want to build a beast of a PC in September/October this year, given that Ryzen 4000 will come out around that date as well.

As it looks right now, a release of the RTX 3000 series around September/October would make the most sense. People will have money for it, and NVIDIA has to release their cards before AMD releases theirs. Q1 2020 doesn't make sense at all, as people will be saving after the holidays.

A Ryzen 4900X + 3080 Ti would be lovely to have.


5 hours ago, D13H4RD said:

I know all this means it’s not exactly a fantastic time to build a PC but....it’s not like I have much of a choice 

Feeling the same way. I have the money to buy a 4,000 EUR PC, but building it now would be insanity. New GPUs and new Ryzen CPUs should come out in Q3/Q4. I originally wanted to build the PC for the Cyberpunk 2077 release, but if I have to wait a month or two for the new components, I'll do that.

