
NVIDIA GeForce 3070/3080/3080 Ti (Ampere): RTX Has No Perf Hit & x80 Ti Card 50% Faster in 4K! (Update 6 ~ Specs / Overview / Details)

16 minutes ago, WereCatf said:

I've been planning to buy a 3000-series NVIDIA GPU anyway once they launch, regardless of any rumours. I'm skipping the 2000-series entirely, upgrading from my GTX 1080. Can't say any of these attempts to hype things up do anything for me -- I know a 3070 or 3080 will be faster than my 1080 and will have some ray-tracing stuff, so I can at least give that a try.

 

I just don't give a fuck about how many cores this or megabytes that it has. The only thing I wish to know is whether AV1 will be among the supported codecs in NVENC or NVDEC.

Ugh, I hate having my GPU die in the middle of this garbage Turing generation; I was hoping my 970 would make it to Ampere. Can't decide if I want to go super cheap with a used RX 570 or get something halfway decent but overpriced in the 1660 Super. Neither will age worth a shit and I'll surely want to replace it by the time Ampere is out, but I gotta buy something in this horrible gen.


1 hour ago, emosun said:

People are so excited about ray tracing, and I still don't get why DX10 is better than DX9.

I would assume that is sarcasm; otherwise you could just google why DX10 is better than DX9 - or rather why DX11 is better: deferred rendering, global illumination and all the modern rendering techniques.


Wanted to buy a new car, but I think my 3k bucks are going to the next xx80 Ti card instead.

One day I will be able to play Monster Hunter Frontier in French/Italian/English on my PC, it's just a matter of time... 4 5 6 7 8 9 years later: It's finally coming!!!

Phones: iPhone 4S/SE | LG V10 | Lumia 920 | Samsung S24 Ultra

Laptops: Macbook Pro 15" (mid-2012) | Compaq Presario V6000

Other: Steam Deck

EVs are bad, they kill the planet and remove freedoms too.


3 hours ago, TVwazhere said:

So, this is ENTIRELY rumor, the sources are "itself" and WCCFTech, and they flat out admit that they know truly nothing about the GPUs...

 

?

Annoying, isn't it? You'd think that after several years of WTFtech hedging their bets on rumor articles for clicks, the tech world would just ignore them as a source altogether. I routinely dismiss all of their articles out of hand (because anything they post that is actually real can be found elsewhere) and have a list of several "journalists" whom I pay no attention to.

 

 

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


This is pure speculation at best. It's the equivalent of a source, if you can even call them that, posting specs of the 2020 phones in 2019. You can take an educated guess, but until the actual release it's a crapshoot.


5 minutes ago, Arika S said:

WCCFTech

 

We Can't Cease Fuckingmakingshitup Tech

We Can't Cease Fibbing Tech.

Come Bloody Angel

Break off your chains

And look what I've found in the dirt.

 

Pale battered body

Seems she was struggling

Something is wrong with this world.

 

Fierce Bloody Angel

The blood is on your hands

Why did you come to this world?

 

Everybody turns to dust.

 

Everybody turns to dust.

 

The blood is on your hands.

 

The blood is on your hands!

 

Pyo.


7 minutes ago, Arika S said:

WCCFTech

 

We Can't Cease Fuckingmakingshitup Tech

Is all that one Aussie word?


I'm just hoping I can replace my RTX 2070 with something affordable that can actually run 3440x1440 at a stable 60 fps.


This will not happen; this is NVIDIA we are talking about, the company that now defines overpriced GPUs. It's wishful thinking to even pretend it will. More VRAM, maybe, but it's going to cost you - and why on earth would you need 16GB in a consumer GPU anyway? I'm just going to settle for a 2080 Ti and stay with that for a few years.

My Current Build: https://uk.pcpartpicker.com/list/36jXwh

 

CPU: AMD - Ryzen 5 3600X | CPU Cooler: Corsair H150i PRO XT | Motherboard: Asus - STRIX X370-F GAMING | RAM: G.SKILL Trident Z RGB 2x8GB DDR4 @3000MHz | GPU: Gigabyte - GeForce RTX 2080 Ti 11 GB AORUS XTREME Video Card | Storage: Samsung - 860 EVO 250GB M.2-2280 - Sandisk SSD 240GB - Sandisk SSD 1TB - WD Blue 4TB | PSU: Corsair RM (2019) 850 W 80+ Gold Certified Fully Modular ATX Power Supply | Case: Corsair - Corsair Obsidian 500D RGB SE ATX Mid Tower Case | System Fans: Corsair - ML120 PRO RGB 47.3 CFM 120mm x 4 & Corsair - ML140 PRO RGB 55.4 CFM 140mm x 2 | Display: Samsung KS9000 | Keyboard: Logitech - G613 | Mouse: Logitech - G703 | Operating System: Windows 10 Pro


15 hours ago, Leviathan- said:

How did you kill that 1080TI? 

No clue. I didn't have it overclocked or anything.


16 hours ago, HarryNyquist said:

If they are I'm gonna be mad AF cuz I literally just bought a 2080 super to replace my dead 1080Ti

Did you brick it? Because I will buy it, then.

CPU: Ryzen 5800X3D | Motherboard: Gigabyte B550 Elite V2 | RAM: G.Skill Aegis 2x16GB 3200 @3600MHz | PSU: EVGA SuperNova 750 G3 | Monitor: LG 27GL850-B , Samsung C27HG70 |
GPU: Red Devil RX 7900XT | Sound: Odac + Fiio E09K | Case: Fractal Design R6 TG Blackout |Storage: MP510 960gb and 860 Evo 500gb | Cooling: CPU: Noctua NH-D15 with one fan

FS in Denmark/EU:

Asus Dual GTX 1060 3GB. Used maximum 4 months total. Looks like new. Card never opened. Give me a price. 


16 hours ago, BiG StroOnZ said:

Finally, NVIDIA is reportedly set to offer its next-gen GeForce RTX 3000 series at cheaper prices than the current-gen GeForce RTX 2000 series.

[Image: "X to doubt"]

I spent $2500 building my PC, and all I do with it is play no games atm, watch anime at 1080p (finally), watch YT and write essays... nothing, it just sits there collecting dust...

Builds:

The Toaster Project! Northern Bee!

 

The original LAN PC build log! (Old, dead and replaced by The Toaster Project & 5.0)


"Here is some advice that might have gotten lost somewhere along the way in your life. 

 

#1. Treat others as you would like to be treated.

#2. It's best to keep your mouth shut and appear to be stupid, rather than open it and remove all doubt.

#3. There is nothing "wrong" with being wrong. Learning from a mistake can be more valuable than not making one in the first place.

 

Follow these simple rules in life, and I promise you, things magically get easier. " - MageTank 31-10-2016

 

 


16 hours ago, emosun said:

People are so excited about ray tracing, and I still don't get why DX10 is better than DX9.

I expect it's very difficult to find an actual answer to this via Google that has good information rather than just game benchmarks, so I'll write one here for anyone interested.

 

For some background: the DirectX API is a programming library designed by Microsoft that lets programmers interact with GPUs without needing to write GPU-specific machine code for each one. That would be infeasible, as even GPUs from the same company may have completely different architectures - and the architecture itself may be a trade secret, so manufacturers don't want you coding against its binary interface anyway.

 

Microsoft specifies a list of hardware features that a GPU needs in order to be awarded a specific DirectX compatibility version, and then licenses the DirectX API to be implemented in the GPU driver. A hardware feature could be "how many texture units are available" (for effects like multi-texturing) or "does this support floating-point operations in programmable GPU shaders".

 

Rivals to DirectX include OpenGL and Vulkan, which do not require a license but use the same "minimum hardware features to be awarded a specific version" scheme. They're also extensible: if a hardware feature is available but isn't enough to be awarded a new version number, developers can still use that feature through an extension. As a result, OpenGL historically ran about two or three years ahead of DirectX in features, and Vulkan is about six months ahead of DX12 (Vulkan is updated roughly every week or two, DX12 about once every six months).
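To make the "extensible" point concrete, here's a minimal sketch of my own (not from the post above): Vulkan lets you enumerate every extension the loader and driver expose, and anything listed is usable even before it gets folded into a core version. This assumes the Khronos Vulkan SDK headers and loader are installed.

// Minimal sketch: list the Vulkan instance extensions the loader exposes.
// Anything printed here is usable even if it isn't core Vulkan yet.
// Build (Linux, with the Vulkan SDK installed): g++ list_ext.cpp -lvulkan
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    // Standard Vulkan idiom: first call gets the count, second call fills the array.
    uint32_t count = 0;
    vkEnumerateInstanceExtensionProperties(nullptr, &count, nullptr);

    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateInstanceExtensionProperties(nullptr, &count, exts.data());

    for (const auto& e : exts)
        std::printf("%s (spec version %u)\n", e.extensionName, e.specVersion);
    return 0;
}

Device-level features work the same way via vkEnumerateDeviceExtensionProperties, which is how a feature can ship as an extension well before any core version requires it.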

 

So, in short, DX10 is basically a sign that says "these DirectX 10 GPUs all have this set of hardware features available". DX9 introduced floating-point shaders (which allow HDR rendering - note this isn't HDR display); DX10 introduced programmable geometry shaders (so the GPU can generate new triangles to process, where previously it would only draw the triangles handed to it) and also introduced "feature levels", so GPUs lacking certain hardware features could still be DX10 compatible (DX11 expanded this to include some DX9-class hardware), as the sketch below illustrates.
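As a hedged illustration of feature levels (again my sketch, not the poster's): Direct3D 11 lets an application hand the runtime a list of feature levels it can work with, and the runtime reports the highest one the hardware actually supports - which is how a single DX11 codebase can still drive DX10- or even DX9-class GPUs. Windows-only; links against d3d11.lib.

// Query the highest Direct3D feature level the default adapter supports.
#include <windows.h>
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    // Feature levels we are willing to run on, listed best-first.
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,  // full DX11-class hardware
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,  // DX10-class hardware
        D3D_FEATURE_LEVEL_9_3,   // DX9-class hardware, still usable from the DX11 API
    };
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_1;
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;

    // The runtime picks the first entry in 'wanted' that the GPU supports.
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        wanted, ARRAYSIZE(wanted), D3D11_SDK_VERSION,
        &device, &got, &context);
    if (FAILED(hr)) {
        std::printf("No D3D11-capable device found.\n");
        return 1;
    }

    std::printf("Highest supported feature level: 0x%04X\n", static_cast<unsigned>(got));
    context->Release();
    device->Release();
    return 0;
}

A GPU that only reports 10_0 here still runs the game, just with the DX10 feature set - the game checks the returned level and disables effects accordingly.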

For programmers, DirectX 10 removed some of the old and rarely used parts of DX9 to make things more streamlined and simplified, so writing a DX10 game is a little easier than a DX9 game - but that only really affects the start of a project; a large game by a major studio isn't much influenced by this factor. DX11 pushed further in this direction.

 

DX12 is a different beast from the DX APIs that came before it, as it is Mantle-derived like Vulkan and Apple's Metal API - so DX12 is more about the programming side of things than the hardware-feature side.


If these spec/nm rumours end up being true, then the performance should be really nice.

 

The price is going to be the deciding factor for me. If they price it like their old Ti cards (excluding the 20** gen), then I'd seriously consider upgrading.

I will probably try my best to avoid the 30** series, much like the 20**, if they decide to go with ~150%+ the price for ~120% the performance again. Getting over 130% of last-gen performance for basically the same price made buying a new card really appealing.

 

The old Ti cards were no-brainers if people had the money. Now, even though I have the money, the current-gen cards just don't offer enough value for me to want to buy.


18 hours ago, OlympicAssEater said:

Bitcoin miners will destroy the MSRP really quickly. I hate those mf Bitcoin miners ruining MSRP and supply.

GPU mining hasn't caused pricing problems since Pascal; RTX prices weren't inflated due to mining but rather due to excessive 10-series stock and a lack of competition.

I WILL find your ITX build thread, and I WILL recommend the Silverstone Sugo SG13B

 

Primary PC:

i7 8086k - EVGA Z370 Classified K - G.Skill Trident Z RGB - WD SN750 - Jedi Order Titan Xp - Hyper 212 Black (with RGB Riing flair) - EVGA G3 650W - dual booting Windows 10 and Linux - Black and green theme, Razer brainwashed me.

Draws 400 watts under max load, for reference.

 

How many watts do I need | ATX 3.0 & PCIe 5.0 spec, PSU misconceptions, protections explained | group reg is bad


3 hours ago, Tedny said:

Dates?! 

Ampere is H2 2020. Samsung's node isn't even in full production yet.


Also, as a note, Nvidia only taped out Ampere in March 2019. (We know because they threw a few parties for having finished it.) They would only have gotten test silicon back recently, so what they're doing (as we see in the report) is hyping up vendors and server-compute buyers. This product is almost a year away right now, so Nvidia is pretty much just marketing.

 

Also, Gen 1 RTX is a tech demo. Gen 2 might avoid the pipeline stalls that bottleneck the current cards, so it'll actually run at the rated "Gigarays".


38 minutes ago, Taf the Ghost said:

Also, as a note, Nvidia only taped out Ampere in March 2019. (We know because they threw a few parties for having finished it.) They would only have gotten test silicon back recently, so what they're doing (as we see in the report) is hyping up vendors and server-compute buyers. This product is almost a year away right now, so Nvidia is pretty much just marketing.

 

Also, Gen 1 RTX is a tech demo. Gen 2 might avoid the pipeline stalls that bottleneck the current cards, so it'll actually run at the rated "Gigarays".

Usually takes a year from tapeout, I thought - so if tapeout was March?

Oops, that was meant as a question.

I think Pascal was a year, right?


Guess I shouldn't have upgraded to the 2080 Super... JK, I did a trade-up from my RTX 2070, so I basically got the 2080 Super for free, so I'm good :)

Corsair iCUE 4000X RGB

ASUS ROG STRIX B550-E GAMING

Ryzen 5900X

Corsair Hydro H150i Pro 360mm AIO

Ballistix 32GB (4x8GB) 3600MHz CL16 RGB

Samsung 980 PRO 1TB

Samsung 970 EVO 1TB

Gigabyte RTX 3060 Ti GAMING OC

Corsair RM850X

Predator XB273UGS QHD IPS 165 Hz

 

iPhone 13 Pro 128GB Graphite


1 hour ago, pas008 said:

Usually takes a year from tapeout, I thought - so if tapeout was March?

Oops, that was meant as a question.

I think Pascal was a year, right?

A final "tapeout" is the submission of the finished silicon design for fabrication. They'll now get the first test parts back and work through them - normally four to five months to get first parts back, if everything is working well enough. Then you have to go through the testing process. How long tapeout-to-launch takes isn't really predictable; if they have to "respin" a couple of times, it adds months. We also don't know whether Nvidia has completed all of the dies they're going to launch with, as some will come later in the process.

 

In theory we could see a late-spring 2020 release from Nvidia, but, as I mentioned, Samsung is still in the volume-ramp phase on their node. We also have no idea what an HPC GPU will look like on that node. Those clocks are going to matter a lot.


22 hours ago, emosun said:

People are so excited about ray tracing, and I still don't get why DX10 is better than DX9.

Because performance and effects are a different matter from lighting.

 


AMD 5000 Series Ryzen 7 5800X | MSI MAG X570 Tomahawk WiFi | G.SKILL Trident Z RGB 32GB (2 * 16GB) DDR4 3200MHz CL16-18-18-38 | Asus GeForce RTX 3080 Ti STRIX | SAMSUNG 980 PRO 500GB PCIe NVMe Gen4 SSD M.2 + Samsung 970 EVO Plus 1TB PCIe NVMe M.2 (2280) Gen3 | Cooler Master V850 Gold V2 Modular | Corsair iCUE H115i RGB Pro XT | Cooler Master Box MB511 | ASUS TUF Gaming VG259Q Gaming Monitor 144Hz, 1ms, IPS, G-Sync | Logitech G 304 Lightspeed | Logitech G213 Gaming Keyboard |

PCPartPicker 


6 hours ago, HarryNyquist said:

No clue. I didn't have it overclocked or anything.

Did you not have an aftermarket card, like an EVGA, with an extended warranty? It's still a good card and I wouldn't give up on it just yet. Just my 2 cents :)

