
Nvidia Claims 3X Improved Performance In Certain Applications With New Driver Update for Titan Xp (Vega Response?)

Max_Settings
8 hours ago, RagnarokDel said:

 

If Nvidia actually supported FreeSync, it would prevent a lot of non-cryptomining market-share loss to AMD.

Nvidia wants you to buy an expensive GSYNC monitor and then continue to buy Nvidia GPU's for the lifespan of that monitor instead.

 

Come upgrade time those of us with these monitors get to decide if we want to buy the video card we'd prefer or the one compatible with our $600+ monitors.


4K // R5 3600 // RTX2080Ti


23 minutes ago, Morgan MLGman said:

Huh, feature level? Then how can one say that their GPUs "fully" support DX12 if they don't support all of its features?

Again, because DirectX 12 is not in any way related to feature level 12, other than that both specs released at roughly the same time.

 

Feature level 12 is as much a DX10/11 thing as a DX12 thing. Pascal supports all of the mandatory features for feature levels 12_0 and 12_1, and all of the optional features other than "Stencil reference value from Pixel Shader" and Tier 2 of the resource heap capabilities. The non-Titan cards also don't support reduced-precision (16-bit and 10-bit) shaders, which again is an optional feature in 11_0, 11_1, 12_0, and 12_1.

 

There are no cards that support everything from feature level 12_1, and by the time there are, there will very likely be a feature level 12_2, specifically because feature levels are designed to be a moving target for hardware developers. But more importantly, there's no reason for a desktop graphics card to support everything optional from 12_1, or even 12_0, since a number of the features in those levels will likely never see use outside of mobile.

 

 


15 hours ago, Max_Settings said:

This is only for the actual Titan Xp, not the Titan X (Pascal). And it's not in games, only certain professional applications.

it's for both:

  • Titan X
  • Titan Xp

they basically enabled Quadro functionality on Titans


5 minutes ago, Valentyn said:

Looks like the Titan X (Pascal) also got decent increases!

User at HardOCP tested with the latest drivers.

https://hardforum.com/threads/vega-rumors.1933112/page-86#post-1043139625

[screenshot: Screen_Shot_2017-08-01_at_7.png]

These numbers definitely look like a response to Vega FE, seeing how they are relatively even now.

 

lol that 600% gain.

if you want to annoy me, then join my teamspeak server ts.benja.cc


12 hours ago, LAwLz said:

Hahah yeah, fuck Nvidia for releasing drivers that increase performance, right?

I hate when my graphics card gets better with newer drivers. They really should stop doing that.

/sarcasm (you'd think that it wouldn't be needed, but I wouldn't be surprised if some AMD fans genuinely thought this way)

 

 

Maybe I am misinterpreting you, but it sounds like you've already made up your mind on what you're going to write in your review, before you even have the cards to test them. Having predefined conclusions is a very bad thing. It will inevitably lead to unconscious bias (because you want to verify that you're correct, instead of looking at the evidence and then making up your mind).

No, it's my initial hypothesis. If Nvidia proves better, then so be it. Nvidia cuts a lot from their GTX line, even their Titans, but AMD only nerfs the double-precision performance of their consumer cards, so AMD is looking good at the moment as far as architectural nerfs go. fp16, for instance, is nerfed on the GTX line, and some games and graphics workloads can use fp16 to reduce memory and processing cost.
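To put a rough number on the memory side of that fp16 point, here's a quick back-of-the-envelope sketch in Python; the buffer size is an arbitrary example chosen for illustration, not a figure from any benchmark in this thread:

```python
import struct

# Arbitrary example: a 3840x2160 buffer with 4 values per pixel.
elements = 3840 * 2160 * 4

fp32_bytes = elements * struct.calcsize('f')  # 32-bit float: 4 bytes each
fp16_bytes = elements * struct.calcsize('e')  # 16-bit half:  2 bytes each

print(fp32_bytes // 2**20, "MiB at fp32")  # 126 MiB
print(fp16_bytes // 2**20, "MiB at fp16")  # 63 MiB, half the memory traffic
```

Halving the element size halves both the storage and the bandwidth a shader pass has to move, which is exactly why reduced-precision support matters even outside of mobile.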

 

However, my test isn't a performance test of games, but rather of the professional applications that are useful to me, plus some architectural benchmarks to show numbers not usually published (like int8 and fp16), and, if I have time, some test code for other instructions.

 

I'm hoping that AMD will do well this time, because it would help competition and keep prices in check. Because AMD didn't have anything decent, Nvidia was able to keep selling their current line with no improvement, especially the Titans. So this test is to compare the Titan to a regular GTX, to AMD's equivalent, and to an older AMD GPU that doesn't have any architectural nerfs or cuts. The only things the Pascal Titan has over its GeForce counterpart are more ROPs, TMUs, L2 cache, and 1GB more VRAM, but it still retains the various nerfs like lowered fp16 performance and a half-duplex bus (which can reduce performance for Nvidia Optimus and eGPU solutions, especially at higher resolutions).

 

Right now I'm just waiting for my AMD GPU to arrive and setting up the test. It's not going to be a great test, as I'm using the GPUs externally, but I will try to cover as many common professional applications as I can. Currently, one major loss for Nvidia is not giving the Titan Xp's features to the Titan X (Pascal), as it's still a workstation card and there are still plenty of workstations with Titan X Pascals out there. It's only been a year.


Awfully suspicious if you ask me.  I mean, I suppose it's possible that they discovered a flaw and were able to legitimately increase performance that much, but I feel like it's more likely they knew how to do this all along and were just intentionally holding it back.

Solve your own audio issues  |  First Steps with RPi 3  |  Humidity & Condensation  |  Sleep & Hibernation  |  Overclocking RAM  |  Making Backups  |  Displays  |  4K / 8K / 16K / etc.  |  Do I need 80+ Platinum?

If you can read this you're using the wrong theme.  You can change it at the bottom.


It looks like Nvidia was holding back performance in professional applications; they did so in the past so you'd buy a Quadro. Some applications would only accelerate on a Quadro or Tesla and wouldn't accept a GeForce.

 

None of the software, however, seems relevant to me. Blender and OpenCL programming are my mains.


2 hours ago, sgloux3470 said:

Nvidia wants you to buy an expensive GSYNC monitor and then continue to buy Nvidia GPU's for the lifespan of that monitor instead.

 

Come upgrade time those of us with these monitors get to decide if we want to buy the video card we'd prefer or the one compatible with our $600+ monitors.


Nvidia is like if phone carriers sold GPUs.


Just now, RagnarokDel said:

Nvidia is like if phone carriers sold GPUs.

Nah, not nearly that bad.  To truly match phone carriers, they'd have to also charge a subscription fee, force you to install various other software on your PC, disable features of it if you don't pay extra, update drivers to be compatible with new games only 8 months after their release, and charge you a fee every time you change cards.


some nVidia cards got better results with new drivers - fuck that, boo nVidia

some Radeon cards get better results with new drivers - AMD FineWineTM

 

because logic


9 minutes ago, zMeul said:

some nVidia cards got better results with new drivers - fuck that, boo nVidia

some Radeon cards get better results with new drivers - AMD FineWineTM

 

because logic

The difference being that Finewine is from driver optimization. This is from nVidia ungimping performance that they purposefully gimped in the first place.


3 hours ago, zMeul said:

it's for both:

  • Titan X
  • Titan Xp

they basically enabled Quadro functionality on Titans

More like re-enabled what the Titans had which got removed and is now back :). It's good that the card actually is what it's marketed as now though.


2 hours ago, ravenshrike said:

The difference being that Finewine is from driver optimization. This is from nVidia ungimping performance that they purposefully gimped in the first place.

It does sure seem that way :/


Imagine a world where Nvidia is the only company making GPUs. Do you think they would have enabled fp16 on a prosumer card? Or would they have kept shafting people into buying Quadros? You answer that. Btw, they could enable it on all high-end Pascal gaming cards since it's the same die, but I doubt they will, since Volta is near; why enable a feature now that you can charge more for later down the line?

Slowly...In the hollows of the trees, In the shadow of the leaves, In the space between the waves, In the whispers of the wind,In the bottom of the well, In the darkness of the eaves...

Slowly places that had been silent for who knows how long... Stopped being Silent.


50 minutes ago, 3DOSH said:

Imagine a world where Nvidia is the only company making GPUs. Do you think they would have enabled fp16 on a prosumer card? Or would they have kept shafting people into buying Quadros? You answer that. Btw, they could enable it on all high-end Pascal gaming cards since it's the same die, but I doubt they will, since Volta is near; why enable a feature now that you can charge more for later down the line?

 

I think many people don't understand just how much profit companies the size of Nvidia have to make to remain viable. To the average consumer, billions in profit looks ugly, and they feel it is only generated through unmitigated greed. But the reality is that nearly all companies need massive profits in order to ensure long-term viability. AMD is very lucky it didn't go bankrupt or get bought out in the last few years. Disabling or delaying features in a product is not greed; it is essentially product revenue management. What happens if you give your customers a product they don't need to upgrade in the next 5 years? Answer: you don't have any revenue for the next 5 years.

 

Competition is good for customers, don't get me wrong, but just don't make the mistake of assuming unnecessary greed is the motive behind a company's (any company's) decisions.


Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


12 hours ago, TidaLWaveZ said:

I wish AMD would stand a little bit closer so their breath would be more noticeable.

If Nvidia doesn't notice the fan on Vega FE, I'd be very surprised. 

Cor Caeruleus Reborn v6

Spoiler

CPU: Intel - Core i7-8700K

CPU Cooler: be quiet! - PURE ROCK 
Thermal Compound: Arctic Silver - 5 High-Density Polysynthetic Silver 3.5g Thermal Paste 
Motherboard: ASRock Z370 Extreme4
Memory: G.Skill TridentZ RGB 2x8GB 3200/14
Storage: Samsung - 850 EVO-Series 500GB 2.5" Solid State Drive 
Storage: Samsung - 960 EVO 500GB M.2-2280 Solid State Drive
Storage: Western Digital - Blue 2TB 3.5" 5400RPM Internal Hard Drive
Storage: Western Digital - BLACK SERIES 3TB 3.5" 7200RPM Internal Hard Drive
Video Card: EVGA - 970 SSC ACX (1080 is in RMA)
Case: Fractal Design - Define R5 w/Window (Black) ATX Mid Tower Case
Power Supply: EVGA - SuperNOVA P2 750W with CableMod blue/black Pro Series
Optical Drive: LG - WH16NS40 Blu-Ray/DVD/CD Writer 
Operating System: Microsoft - Windows 10 Pro OEM 64-bit and Linux Mint Serena
Keyboard: Logitech - G910 Orion Spectrum RGB Wired Gaming Keyboard
Mouse: Logitech - G502 Wired Optical Mouse
Headphones: Logitech - G430 7.1 Channel  Headset
Speakers: Logitech - Z506 155W 5.1ch Speakers

 


10 minutes ago, rrubberr said:

So to clarify, there is no point updating the driver if you're using older Titans? When I search for Titan Z drivers on Nvidia's website, it shows this one as compatible. Why did they include the whole lineup of cards in the update's compatibility list if it was just meant for the Titan Xp?

Because there are probably other minor driver updates bundled with it. The performance boost itself, however, is strictly for the newer cards.


11 hours ago, Billy_Mays said:

Yeah, that's the problem. AMD should have made it a lot better than a Titan X, so then Nvidia would start shitting bricks.

But is that really worth it? As far as I can tell, Vega doesn't have fp64 hardware on board, which means with Vega as the architecture you would probably be looking at a 650 mm^2+ die in order to beat the 1080 Ti, not to mention the extra costs from the interposer and such, which would probably make Vega expensive af. The top-end GPU would probably be in the range of $900 at least. Who is gonna want to pay that?

Make sure to quote me or tag me when responding to me, or I might not know you replied! Examples:

 

Do this:

Quote

And make sure you do it by hitting the quote button at the bottom left of my post, and not the one inside the editor!

Or this:

@DocSwag

 

Buy whatever product is best for you, not what product is "best" for the market.

 

Interested in computer architecture? Still in middle or high school? P.M. me!

 

I love computer hardware and feel free to ask me anything about that (or phones). I especially like SSDs. But please do not ask me anything about Networking, programming, command line stuff, or any relatively hard software stuff. I know next to nothing about that.

 

Compooters:

Spoiler

Desktop:

Spoiler

CPU: i7 6700k, CPU Cooler: be quiet! Dark Rock Pro 3, Motherboard: MSI Z170a KRAIT GAMING, RAM: G.Skill Ripjaws 4 Series 4x4gb DDR4-2666 MHz, Storage: SanDisk SSD Plus 240gb + OCZ Vertex 180 480 GB + Western Digital Caviar Blue 1 TB 7200 RPM, Video Card: EVGA GTX 970 SSC, Case: Fractal Design Define S, Power Supply: Seasonic Focus+ Gold 650w Yay, Keyboard: Logitech G710+, Mouse: Logitech G502 Proteus Spectrum, Headphones: B&O H9i, Monitor: LG 29um67 (2560x1080 75hz freesync)

Home Server:

Spoiler

CPU: Pentium G4400, CPU Cooler: Stock, Motherboard: MSI h110l Pro Mini AC, RAM: Hyper X Fury DDR4 1x8gb 2133 MHz, Storage: PNY CS1311 120gb SSD + two Segate 4tb HDDs in RAID 1, Video Card: Does Intel Integrated Graphics count?, Case: Fractal Design Node 304, Power Supply: Seasonic 360w 80+ Gold, Keyboard+Mouse+Monitor: Does it matter?

Laptop (I use it for school):

Spoiler

Surface book 2 13" with an i7 8650u, 8gb RAM, 256 GB storage, and a GTX 1050

And if you're curious (or a stalker) I have a Just Black Pixel 2 XL 64gb

 


18 hours ago, ravenshrike said:

As soon as they saw that AMD had their head screwed on straight again with Ryzen and how Intel got suckerpunched nVidia started to move. That's why that MCM whitepaper with MIT came out recently. They do not want to play the part of Intel. As it is, assuming Navi is a success in its MCM implementation similar to Ryzen then you should see AMD able to saturate the GPU markets from low to high end with little resistance. IPC is a much lesser concern on GPUs. nVidia has started to move now so they're not left out in the cold for 3+ years. However they have got to be pumping a shitload of money into that particular R&D program if they want a successful implementation taped out within the next 2 years.

Nvidia has been moving for years. Maxwell, Pascal, that stuff has probably been in development since 2011.

 

You also have to remember that an MCM implementation is not perfect. Latency takes a hit from having more hops to go through. We may see inconsistency, just like we sometimes do on Ryzen and Epyc.


12 hours ago, Max_Settings said:

almost another doubling in performance like we saw from Maxwell to Pascal.

That seems... high. Do you have some benchmarks from then to show this? I don't recall the performance jump being that vast, unless you're referring to professional applications, not games.


1 hour ago, DocSwag said:

Nvidia has been moving for years. Maxwell, Pascal, that stuff has probably been in development since 2011.

 

You also have to remember doing an MCM implementation is not perfect. Latency sees a jump due to having more stuff to jump through. We may see inconsistency, just like how we sometimes do on Ryzen and epyc.

And a GPU needs a shitload more bandwidth than the 30 GB/s in Zen.
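To put a rough number on that gap, here's a quick sketch; the Titan Xp figures are its widely published specs (384-bit GDDR5X bus, ~11.4 Gbps effective per pin), and the ~30 GB/s fabric-link figure is simply the value quoted above:

```python
# Back-of-the-envelope GPU memory bandwidth: bus width (bytes) x data rate per pin.
bus_width_bits = 384       # Titan Xp memory bus
gbps_per_pin = 11.4        # effective GDDR5X data rate

gpu_bandwidth_gbs = bus_width_bits / 8 * gbps_per_pin
print(f"{gpu_bandwidth_gbs:.1f} GB/s")  # 547.2 GB/s

fabric_link_gbs = 30  # the Zen die-to-die figure mentioned above
print(f"~{gpu_bandwidth_gbs / fabric_link_gbs:.0f}x a single link")  # ~18x
```

That roughly 18x gap is why an MCM GPU interconnect is a much harder problem than stitching CPU dies together.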


8 hours ago, zMeul said:

some nVidia cards got better results with new drivers - fuck that, boo nVidia

some Radeon cards get better results with new drivers - AMD FineWineTM

 

because logic

There's a strong perception that Nvidia was just sitting on this update until they had a reason to release it. Most likely it's optimizations that were already in the Quadro line that they are now bequeathing to the Titan cards.

 

Whether that's true or not is only something Nvidia knows.  

 

 

