
Nvidia: No next-gen graphics cards for a long time

Wh0_Am_1
3 minutes ago, Tribalinius said:

The project is probably under a metric ton of NDAs for all we know. I really doubt Intel waited for Raja to leave to start their project. I'm sure some of the groundwork had already been done by the time he signed with them. The faster Intel gets its GPU on the market, the better it will be for all of us anyway.

I partially agree with you, but for Intel's sake I hope they aren't speeding through this. Larrabee was quite the disaster, and seeing it replicated would only seal Intel's fate.


1 hour ago, Princess Cadence said:

I sold my TITAN X Maxwell in September last year for about the price 1070s were going for, around $450... bought the FE 1080 Ti second-hand (literally just three weeks of use) from a closing-down rendering studio in October, just before the mining craze... paid $600 for it.

 

I got lucky yeah... the longer it lasts now indeed the happier I'll be.

I got my FTW3 1080 Ti new for $800 a day or so before New Year's Eve. I thought it was overpriced at $800 and felt like I paid too much for it. Within a couple of days the price of the 1080 Ti was $1,500, and I felt pretty lucky that I bought it when I did.


3 minutes ago, Brooksie359 said:

I thought it was overpriced at $800 and felt like I paid too much for it

OK, well, what would have been a price that would have been justified? Ti cards have always been in that price range.


2 minutes ago, mynameisjuan said:

OK, well, what would have been a price that would have been justified? Ti cards have always been in that price range.

$760 to $780. I mean, the Strix was about $20 to $30 cheaper, but it was out of stock. Actually, most of the 1080 Tis were out of stock, which I didn't think much of at the time. I thought it was due to people buying cards after Christmas, but shortly after, the lack of stock ended up skyrocketing the price. I'm quite glad Micro Center had any left when I got there, because if I had waited a day or so they likely wouldn't have had any.


42 minutes ago, Valentyn said:

Everyone can get Volta now and stop waiting. Only the low, low price of €3000

Volta has no per-CUDA-core performance gain over Pascal. It's all down to the die shrink and increased core count. GDDR6 might give you a minor bump beyond that, but the gain would be the same if you slapped GDDR6 on a Pascal card.
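A quick back-of-envelope calculation backs this up: peak FP32 throughput is just 2 FLOPs (one fused multiply-add) per CUDA core per clock, so the Titan V's lead over a 1080 Ti tracks its extra cores almost exactly. A minimal sketch in Python, using the published core counts and boost clocks for both cards:

```python
# Peak FP32 throughput: 2 FLOPs (one fused multiply-add) per CUDA core per clock.
def peak_tflops(cuda_cores: int, boost_mhz: float) -> float:
    return 2 * cuda_cores * boost_mhz * 1e6 / 1e12

cards = {
    "GTX 1080 Ti (Pascal)": (3584, 1582),  # published cores / boost clock (MHz)
    "Titan V (Volta)":      (5120, 1455),
}

for name, (cores, mhz) in cards.items():
    print(f"{name}: {peak_tflops(cores, mhz):.1f} TFLOPS FP32")

# ~14.9 vs ~11.3 TFLOPS: the Titan V's ~32% lead is just its ~43% core-count
# advantage scaled down by its lower clock, with no per-core gain anywhere.
```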


9 hours ago, GoldenLag said:

AMD, our lord and savior, please bestow upon us the blessing of competition

The problem is that the people shouting at AMD to provide competition (maybe not YOU specifically, but I suspect this is the broader issue) give AMD no incentive to chase after gamers, because you don't buy their cards for gaming even when they do catch up enough for NVIDIA to respond. It's the consumers' fault just as much as it is AMD's.


1 hour ago, SpencerC said:

Intel getting a gaming card out by 2020 is very optimistic. It takes a couple of years to design a GPU worth beans, and Intel has to be doubly sure that they won't infringe any patents or copyrights held by either NVIDIA or AMD. 2022? That's a little more realistic.

Take a look at this article from Forbes: it will actually hit in 2020. Here is another one from Digital Trends.

 


The year is 20XX, no new GPUs have been released, and all anyone plays is Melee, Fox only.

 

 


28 minutes ago, Okjoek said:

The problem is that the people shouting at AMD to provide competition (maybe not YOU specifically, but I suspect this is the broader issue) give AMD no incentive to chase after gamers, because you don't buy their cards for gaming even when they do catch up enough for NVIDIA to respond. It's the consumers' fault just as much as it is AMD's.

I would buy an AMD card for my upgrade, but at the time of purchase there was a mining craze. As much as I wanted an AMD card, there wasn't anything at that price tier that was worth it. I'm probably going to end up with Navi or 7nm Vega, if that ever touches the consumer market. Also, I'm a cheapskate, so I might end up getting a second-hand Vega 64 or a 290X.


Well, on one hand, maybe AMD has a chance to release something competitive?

 

On the other, my 1060 has some breathing room again.


2 hours ago, maartendc said:

Why did Intel churn out the same old 4-core processors with 5% yearly improvements between 2012 and 2017? Because that was the best they could do? No way.

Yes, on the surface.

 

In actuality, Intel has been working on uncore subsystems and new instruction sets: things that don't translate to across-the-board gains.

2 hours ago, maartendc said:

All of a sudden AMD releases competitive 6-core and 8-core CPUs, and BOOM, a few months later Intel steps up their game with 6-core, 12-thread Coffee Lake CPUs, and i9 with many more cores to compete with Threadripper.

The only things that changed were that Coffee Lake got released prematurely, after Haswell through Kaby Lake failed to deliver six cores at their target TDP and clocks, and that more Xeons got unlocked multipliers.


11 hours ago, Wh0_Am_1 said:

So the question is, why is this happening? Is Nvidia having issues, or is the GTX 10 series just too successful for Nvidia to innovate?

Now Nvidia once again discussed ray tracing, or Nvidia RTX, and one has to ask how Nvidia plans to power this, because right now you apparently need four Tesla V100s

To answer the question, I think it's the latter. Pascal in NVIDIA's eyes is "too successful" for them to release anything new.

 

With that said, I do think there's no game out there that has truly pushed Pascal to the very limits of what it can do. Final Fantasy XV Windows Edition may have pushed Pascal rather hard, but it's nowhere near what I think would show off Pascal's true capabilities, beyond the triple-digit framerates at max settings on GTX 1060s that everyone here expects anyway.

 

Ray tracing can already be done on an ancient architecture like Kepler. The best example is OTOY's Brigade Engine, which does ray tracing at the global level using a pair of GeForce GTX TITAN cards in SLI (the original TITANs, before the TITAN Black) at 1080p, before factoring in optimisation time to get it running at 60 FPS, potentially on one card instead of two. This is far more demanding than doing ray tracing only for reflections, refractions, and shadows.

 

So to answer "How does NVIDIA plan to power this thing...?": well, they can tune RTX to run on Pascal (down to the GeForce GTX 1060 6GB for good quality, anyway) and, at the absolute most, Maxwell (to the tune of a GeForce GTX 980). They can set up RTX to run on lower-end hardware, but you'd have to sacrifice image quality to get it running on something like a GeForce GTX 1050 Ti 4GB or a GeForce GTX 960 (both 2GB and 4GB models).
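To put "far more demanding" in rough numbers, here's a minimal sketch comparing ray budgets at 1080p60. The samples-per-pixel and bounce counts are illustrative assumptions on my part, not measured figures from Brigade or RTX:

```python
# Rough ray budgets at 1080p60. The per-pixel ray counts below are
# illustrative assumptions, not measured figures from Brigade or RTX.
W, H, FPS = 1920, 1080, 60
pixels_per_sec = W * H * FPS  # ~124 million pixels drawn per second

# Global path tracing: assume 4 samples per pixel, ~3 bounces per sample.
full_path_tracing = pixels_per_sec * 4 * 3
# Per-effect hybrid: assume ~1 reflection ray + 1 shadow ray per pixel.
hybrid_effects = pixels_per_sec * 2

print(f"Full path tracing : {full_path_tracing / 1e9:.2f} Grays/s")
print(f"Hybrid (refl+shad): {hybrid_effects / 1e9:.2f} Grays/s")
# ~1.49 vs ~0.25 billion rays/s: a ~6x gap even under these modest
# assumptions, which is why global ray tracing needed a pair of TITANs
# while per-effect ray tracing can be scaled down to mid-range cards.
```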


3 minutes ago, Jurunce said:

To answer the question, I think it's the latter. Pascal in NVIDIA's eyes is "too successful" for them to release anything new.

With that said, I do think there's no game out there that has truly pushed Pascal to the very limits of what it can do. Final Fantasy XV Windows Edition may have pushed Pascal rather hard, but it's nowhere near what I think would show off Pascal's true capabilities, beyond the triple-digit framerates at max settings on GTX 1060s that everyone here expects anyway.

Ray tracing can already be done on an ancient architecture like Kepler. The best example is OTOY's Brigade Engine, which does ray tracing at the global level using a pair of GeForce GTX TITAN cards in SLI (the original TITANs, before the TITAN Black) at 1080p, before factoring in optimisation time to get it running at 60 FPS, potentially on one card instead of two. This is far more demanding than doing ray tracing only for reflections, refractions, and shadows.

So to answer "How does NVIDIA plan to power this thing...?": well, they can tune RTX to run on Pascal (down to the GeForce GTX 1060 6GB for good quality, anyway) and, at the absolute most, Maxwell (to the tune of a GeForce GTX 980). They can set up RTX to run on lower-end hardware, but you'd have to sacrifice image quality to get it running on something like a GeForce GTX 1050 Ti 4GB or a GeForce GTX 960 (both 2GB and 4GB models).

RTX needs the Tensor cores within Volta to work efficiently, so that is not possible unless they release new cards with Tensor cores baked in.


9 minutes ago, Wh0_Am_1 said:

RTX needs the Tensor cores within Volta to work efficiently, so that is not possible unless they release new cards with Tensor cores baked in.

It does in its current form. What I'm getting at is they can "tune it" to run on Pascal as well to compete with alternative solutions like the Brigade Engine. This means they can make RTX run off of CUDA if need be. If solutions like that exist, then NVIDIA has competition to tackle.

 

Also you didn't have to quote the entire thing if the last paragraph of mine is being addressed :)


I bought my 1080 last November.  I suppose it's nice to know it's not obsolete yet :)

Link to comment
Share on other sites

Link to post
Share on other sites

4 minutes ago, Jurunce said:

It does in its current form. What I'm getting at is they can "tune it" to run on Pascal as well to compete with alternative solutions like the Brigade Engine. This means they can make RTX run off of CUDA if need be. If solutions like that exist, then NVIDIA has competition to tackle.

 

Also you didn't have to quote the entire thing if the last paragraph of mine is being addressed :)

They could, but Tensor cores are far more efficient at this task than CUDA cores, so doing it that way would be expensive and would eat heavily into game performance. That is why Tensor cores are such a big deal: the next generation will likely have more CUDA cores than the last, plus separate Tensor cores for ray tracing, which means the tech won't interfere with your FPS or with object detail.
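For a sense of why the gap is so large: a Volta Tensor core performs a whole 4x4 mixed-precision matrix multiply-accumulate (D = A x B + C) per clock, where a CUDA core performs a single scalar FMA; in RTX, that matrix throughput feeds the AI denoiser rather than the ray traversal itself. A minimal sketch of the Tensor-core primitive, with NumPy standing in for the hardware:

```python
import numpy as np

# The primitive a Volta Tensor core executes each clock: D = A @ B + C,
# with A and B in FP16 and accumulation in FP32.
A = np.random.rand(4, 4).astype(np.float16)
B = np.random.rand(4, 4).astype(np.float16)
C = np.random.rand(4, 4).astype(np.float32)

D = A.astype(np.float32) @ B.astype(np.float32) + C  # emulated in software

# A 4x4x4 matrix multiply-accumulate is 128 FLOPs in a single op, versus
# 2 FLOPs for one scalar FMA on a CUDA core. That gap is what makes the
# AI denoiser cheap on Tensor cores and costly when emulated on CUDA cores.
matmul_flops = 4 * 4 * 4 * 2
print(f"{matmul_flops} FLOPs per Tensor-core op vs 2 per CUDA-core FMA")
```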


The ONE time I get myself to not buy every new card... the ONE time I manage to hold myself back a generation.

Now I'm still on a 970 and have wanted to upgrade for almost two years... I only wanted to skip a single generation.

Who knew the generation would last like four years. Damn.

Link to comment
Share on other sites

Link to post
Share on other sites

1 minute ago, Rattenmann said:

The ONE time I get myself to not buy every new card... the ONE time I manage to hold myself back a generation.

Now I'm still on a 970 and have wanted to upgrade for almost two years... I only wanted to skip a single generation.

Who knew the generation would last like four years. Damn.

IKR?

In search of the future, new tech, and exploring the universe! All under the cover of anonymity!

Link to comment
Share on other sites

Link to post
Share on other sites

13 minutes ago, Wh0_Am_1 said:

IKR?

15 minutes ago, Rattenmann said:

The ONE time I get myself to not buy every new card... the ONE time I manage to hold myself back a generation.

Now I'm still on a 970 and have wanted to upgrade for almost two years... I only wanted to skip a single generation.

Who knew the generation would last like four years. Damn.

 

Same here...

"To the wise, life is a problem; to the fool, a solution" (Marcus Aurelius)

Link to comment
Share on other sites

Link to post
Share on other sites

They should at least work on G-Sync 2.0, along with releasing more actual IPS G-Sync panels and none of that TN garbage. By "actual" I mean monitors that support HDR with at least around 500 nits, 100 Hz, 4K, 30 inches, etc. These are all possible after seeing what they did with the Predator X27. If they feel the GPUs are good enough (a bit more power would certainly be useful for 4K), they should get the monitors right. There are hundreds of FreeSync monitors out there, and the new LG monitors, like the 32-inch ones, are really impressive in the sense that they support HDR, 99% sRGB, etc. G-Sync ones? Not so many new monitors, except the two or three that cost $2,000.

So that's tens of monitors versus a couple of hundred. At least reduce G-Sync pricing; G-Sync monitors cost absurdly more than FreeSync ones.


2 hours ago, Motifator said:

They should at least work on G-Sync 2.0, along with releasing more actual IPS G-Sync panels and none of that TN garbage. By "actual" I mean monitors that support HDR with at least around 500 nits, 100 Hz, 4K, 30 inches, etc. These are all possible after seeing what they did with the Predator X27. If they feel the GPUs are good enough (a bit more power would certainly be useful for 4K), they should get the monitors right. There are hundreds of FreeSync monitors out there, and the new LG monitors, like the 32-inch ones, are really impressive in the sense that they support HDR, 99% sRGB, etc. G-Sync ones? Not so many new monitors, except the two or three that cost $2,000.

So that's tens of monitors versus a couple of hundred. At least reduce G-Sync pricing; G-Sync monitors cost absurdly more than FreeSync ones.

Ugh... that is possible with current G-Sync... Have you watched the latest LTT video?


1 minute ago, Wh0_Am_1 said:

Ugh... that is possible with current G-Sync... Have you watched the latest LTT video?


What are we talking about here? To my knowledge, G-Sync currently doesn't support HDR unless you buy into that $2,000 FALD panel, maybe.


15 hours ago, Wh0_Am_1 said:

Hey everyone! Nvidia announced at Computex tonight that there will be no new GPUs for a long time. This comes right after Jensen discussed the success of GeForce cards as a distributed computing platform and the booming success of games such as PUBG and Fortnite.

So the question is, why is this happening? Is Nvidia having issues, or is the GTX 10 series just too successful for Nvidia to innovate?

Now Nvidia once again discussed ray tracing, or Nvidia RTX, and one has to ask how Nvidia plans to power this, because right now you apparently need four Tesla V100s

and the Pascal generation of cards is already two years old, so how long can Nvidia make us wait before it starts to hurt? And when are we getting the 1180? I don't know about you guys, but this is starting to get tedious.

[Insert Oprah meme here; sorry, I cannot be bothered to fire up GIMP/Photoshop and animate it... but]

You get a 1080, you get a 1080... and you get a 1080!!! EVERYONE GETS A 1080!!!! [Forever... :( ]

But that's what I assume it to be: between mining, the economy, and fabrication yields, everyone is expected to live off 1080s.


6 minutes ago, Motifator said:


What are we talking about here? To my knowledge, G-Sync currently doesn't support HDR unless you buy into that $2,000 FALD panel, maybe.

Sorry, I meant yesterday's video about the 4K, 1,000-nit HDR, 144 Hz G-Sync monitor from ASUS. P.S. It compromises HDR image quality when pushed beyond 120 Hz, because DisplayPort 1.4 runs out of bandwidth and the panel has to drop to 4:2:2 chroma subsampling.
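The refresh-rate ceiling is simple bandwidth arithmetic: 4K at 10-bit RGB stops fitting in DisplayPort 1.4's payload well before 144 Hz. A rough sketch that ignores blanking overhead (which makes the real limits slightly lower):

```python
# Does 4K 10-bit HDR fit through DisplayPort 1.4 (HBR3)?
DP14_PAYLOAD_GBPS = 25.92  # 32.4 Gbit/s raw minus 8b/10b encoding overhead
W, H = 3840, 2160

def needed_gbps(refresh_hz: int, bits_per_pixel: int) -> float:
    return W * H * refresh_hz * bits_per_pixel / 1e9

for hz in (98, 120, 144):
    rgb = needed_gbps(hz, 30)  # 10-bit RGB   = 30 bits per pixel
    sub = needed_gbps(hz, 20)  # 10-bit 4:2:2 = 20 bits per pixel
    print(f"{hz:>3} Hz: RGB {rgb:5.1f} Gbit/s, 4:2:2 {sub:5.1f} Gbit/s "
          f"(limit {DP14_PAYLOAD_GBPS})")

# 10-bit RGB fits only up to ~98 Hz; at 144 Hz it would need ~35.8 Gbit/s,
# so the panel must fall back to 4:2:2 chroma subsampling. That is the
# compromise the X27-class monitors make at their top refresh rates.
```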

 


You know that monitor costs at least $2,000, right? Availability is an issue too. There are FreeSync 2 monitors that cost a third of that and still do HDR. Not as well as that one, but they still sort of do it. Most people can't afford that monitor anyway. In Europe it costs upwards of €2,500 with VAT, even.

