Pascal consumer = Maxwell on speed confirmed

Agosto
6 minutes ago, Briggsy said:

uses 61.7% less power compared to an R9 390.

 

uses 74.4% less power compared to a GTX 980.

It uses 61.7/74.4% of the power, i.e. 38.3/25.6% less power*
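For anyone who trips over the percentage wording, the conversion is simple subtraction; a quick Python sanity check of the correction above:

```python
def percent_less(percent_of):
    """'Uses X% of the power' is the same as 'uses (100 - X)% less power'."""
    return round(100.0 - percent_of, 1)

# The two figures quoted above:
print(percent_less(61.7))  # -> 38.3  (vs. the R9 390)
print(percent_less(74.4))  # -> 25.6  (vs. the GTX 980, not 26.6)
```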

PSU Tier List | CoC

Gaming Build | FreeNAS Server

Spoiler

i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core

Spoiler

FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.


56 minutes ago, AluminiumTech said:

think of VEGA as big Polaris because it will just be bigger Polaris with GDDR5X or HBM2.

Nah, Vega is "Polaris+"... it will have some additional efficiency gains that should put its perf per watt between Maxwell and Pascal. Relative to performance, of course.


1 hour ago, AluminiumTech said:

think of VEGA as big Polaris because it will just be bigger Polaris with GDDR5X or HBM2.

Why did you reply to me?

[Out-of-date] Want to learn how to make your own custom Windows 10 image?

 

Desktop: AMD R9 3900X | ASUS ROG Strix X570-F | Radeon RX 5700 XT | EVGA GTX 1080 SC | 32GB Trident Z Neo 3600MHz | 1TB 970 EVO | 256GB 840 EVO | 960GB Corsair Force LE | EVGA G2 850W | Phanteks P400S

Laptop: Intel M-5Y10c | Intel HD Graphics | 8GB RAM | 250GB Micron SSD | Asus UX305FA

Server 01: Intel Xeon D 1541 | ASRock Rack D1541D4I-2L2T | 32GB Hynix ECC DDR4 | 4x8TB Western Digital HDDs | 32TB Raw 16TB Usable

Server 02: Intel i7 7700K | Gigabye Z170N Gaming5 | 16GB Trident Z 3200MHz


People still ignore one of the elephants in the room. AMD will always have a higher hard floor to their power draw due to the inclusion of a hardware scheduler. Big help for certain workloads, big detriment to power draw crowns, and I wonder if that extra draw and work performed on die means it also has a harder time keeping temps down? @patrickjp93 I was going to ask that of you in another thread but forgot, does the hardware scheduler put other stresses on the GPU besides just greater power draw?

 

Now, independent of the scheduler, the nVidia uArch is probably still a more efficient design "core" for "core". It clocks higher and pushes a lot of brute-force performance. AMD's uArch for a while now has seemed to be more about torque than horsepower: keeping revs low and moving big loads by focusing on the needs of difficult tasks, rather than revving to 10k RPM and smoking the tires.


1 minute ago, HalGameGuru said:

People still ignore one of the elephants in the room. AMD will always have a higher hard floor to their power draw due to the inclusion of a hardware scheduler. Big help for certain workloads, big detriment to power draw crowns, and I wonder if that extra draw and work performed on die means it also has a harder time keeping temps down? @patrickjp93 I was going to ask that of you in another thread but forgot, does the hardware scheduler put other stresses on the GPU besides just greater power draw?

 

Now, independent of the scheduler, the nVidia uArch is probably still a more efficient design "core" for "core". It clocks higher and pushes a lot of brute-force performance. AMD's uArch for a while now has seemed to be more about torque than horsepower: keeping revs low and moving big loads by focusing on the needs of difficult tasks, rather than revving to 10k RPM and smoking the tires.

No, but AMD has other problems it has to address too, such as weaker and fewer ROPs.

The problem with AMD's strategy is that it leaves them weaker in HPC environments, where the chief driving purchase metric isn't performance/$ or performance/watt. It's performance/watt/$. Nvidia's platform just provides more for your money as a GPGPU algorithm engineer. Not only do you have all the original 32-bit hardware, but you also get the dedicated 64-bit SPs, and that allows a vastly more efficient design than having the control logic that snaps two 32-bit SPs together into a 64-bit-capable SP, as AMD does. AMD is not famous for properly addressing the problems it faces; it just throws more complex hardware at the problem instead of addressing individual problems...
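To make the compound metric concrete, here's a tiny sketch with made-up card numbers (hypothetical, purely illustrative): a card can win on raw performance and still lose once watts and price are folded in.

```python
# Hypothetical cards, purely to illustrate the perf/watt/$ metric;
# none of these numbers are real benchmarks.
cards = {
    "Card A": {"perf": 100.0, "watts": 250.0, "price": 600.0},
    "Card B": {"perf": 130.0, "watts": 300.0, "price": 900.0},
}

def perf_per_watt_per_dollar(spec):
    return spec["perf"] / spec["watts"] / spec["price"]

# Card B has more raw performance, but Card A wins the compound metric.
best = max(cards, key=lambda name: perf_per_watt_per_dollar(cards[name]))
for name, spec in cards.items():
    print(f"{name}: {perf_per_watt_per_dollar(spec):.8f}")
print(f"Best on perf/watt/$: {best}")
```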

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


1 minute ago, patrickjp93 said:

No, but AMD has other problems it has to address too, such as weaker and fewer ROPs.

The problem with AMD's strategy is that it leaves them weaker in HPC environments, where the chief driving purchase metric isn't performance/$ or performance/watt. It's performance/watt/$. Nvidia's platform just provides more for your money as a GPGPU algorithm engineer. Not only do you have all the original 32-bit hardware, but you also get the dedicated 64-bit SPs, and that allows a vastly more efficient design than having the control logic that snaps two 32-bit SPs together into a 64-bit-capable SP, as AMD does. AMD is not famous for properly addressing the problems it faces; it just throws more complex hardware at the problem instead of addressing individual problems...

Weaker in what sense? They are better fed with memory bandwidth.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022),

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


Just now, AluminiumTech said:

Weaker in what sense? They are better fed with memory bandwidth.

Actually, they've been perfectly well fed; if anything they're overfed/under pressure. Fury (X) was a classic case of a GCN design bottleneck: the ROP:SP ratio is not optimal for the strength of the ROPs AMD provides. Nvidia provides roughly double the number of ROPs per SP, allowing it to get away with fewer, faster SPs, while Delta Color Compression and other tech makes up for the lower bandwidth relative to AMD's offerings.
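A quick ratio check using the SP and ROP counts commonly listed on spec sheets (take the exact figures as assumptions; the point is the roughly 2x gap in the ratio, not the precise values):

```python
# SP and ROP counts as commonly listed on spec sheets; treat them as
# assumptions here -- the point is the ratio, not the exact figures.
gpus = {
    "R9 Fury X (GCN)":      {"sps": 4096, "rops": 64},
    "GTX 980 Ti (Maxwell)": {"sps": 2816, "rops": 96},
}

for name, spec in gpus.items():
    ratio = spec["sps"] / spec["rops"]
    print(f"{name}: {ratio:.1f} SPs per ROP")
```

Fewer SPs per ROP means each ROP is less likely to sit starved behind shader work, which is the imbalance being described above.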

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


3 hours ago, laminutederire said:

Dude, the rx 480 already owns the 970, do your research before trying to start a flame war.

There's a difference between owns and crushes.

I'm an educated fool with money on my mind.

They say I got to learn, but nobody's here to teach me; if they can't understand it, how can they reach me?

Power and the money, money and the power, minute after minute, hour after hour.

My Motivation


46 minutes ago, Dresta said:

There's a difference between owns and crushes.

It's halfway between a 970 and 980, so that counts as crushing as well in my dictionary 


Even though the architecture didn't really change and Pascal is just a node shrink, I'm still impressed with Nvidia. They could have just followed the tradition of adding 10-15% more cores and ending up with 10-15% more performance. But they gave us more than 10-15%.

 

Compare that to AMD, shrinking the process and adding exactly the expected 10-15% performance boost.

 

I can only imagine AMD got a real surprise when the 1080 was revealed. Even with their own Vega on the horizon.

 

 

Ryzen 7 2700x | MSI B450 Tomahawk | GTX 780 Windforce | 16GB 3200
Dell 3007WFP | 2xDell 2001FP | Logitech G710 | Logitech G710 | Team Wolf Void Ray | Strafe RGB MX Silent
iPhone 8 Plus | ZTE Axon 7 | iPad Air 2 | Nvidia Shield Tablet 32gig LTE | Lenovo W700DS


10 hours ago, don_svetlio said:

There are more than a dozen games out and dozens more on the way using DX12. Vulkan is yet to arrive in full force but DX12 is here. Sorry but Nvidia's Pascal is not ready for it. Volta likely will be.

By the time DX12 and Vulkan are required, Pascal will be ancient anyway.  It's not like devs are going to force people to use DX12 immediately, do you remember how long it took for DX11 to be required?

 

Even if Pascal doesn't last once DX12 and Vulkan are mainstream it won't matter, because they won't be fast enough to run the games anyway.  By 2019 we're going to have new consoles and a 1080 will be the equivalent of a 680 today.

System:  i7-6700k @ 4.5 GHz Gigabyte G1 Gaming GTX 1080 16GB Corsair LPX DDR4-3000 | Gigabyte GA-Z170N Gaming 5 ITX 

Corsair SF600 SFX PSU Seagate 2TB HDD, 1TB SSHD | Samsung 840 Evo 120GB Boot DriveSilverstone ML08B-H |


[Image: AMD next-gen Vega and Navi GPU roadmap (2017)]

 

This roadmap clearly shows Vega isn't gonna be just a bigger Polaris chip. As you can see, the perf/watt is significantly higher than Polaris; HBM is gonna account for some of it, but not all of it I'm sure, as it's close to a 1.5x-2x increase.
Polaris 11 was the main focus with Polaris (AMD stated that in an interview with PCPer), which is why it scales poorly with clocks from a power-consumption point of view.
With Vega this shouldn't be a problem, well, at least I hope.
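To see why HBM alone probably can't explain the whole jump, here's a toy decomposition (every number below is a hypothetical placeholder, not a leaked spec):

```python
# Toy perf/watt decomposition; every number is a hypothetical placeholder.
vega_target = 1.75   # midpoint of the roadmap's ~1.5x-2x perf/watt claim
hbm2_gain = 1.15     # assumed gain from HBM2's power savings alone

# Whatever HBM2 doesn't account for has to come from the core itself.
core_gain_needed = vega_target / hbm2_gain
print(f"Core-side perf/watt gain still needed: {core_gain_needed:.2f}x")
```

Even with a generous assumption for the memory subsystem, the core would still need a sizeable efficiency gain of its own to hit the roadmap's claim.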

Slowly...In the hollows of the trees, In the shadow of the leaves, In the space between the waves, In the whispers of the wind,In the bottom of the well, In the darkness of the eaves...

Slowly places that had been silent for who knows how long... Stopped being Silent.


14 minutes ago, Autoimmunity said:

By the time DX12 and Vulkan are required, Pascal will be ancient anyway.  It's not like devs are going to force people to use DX12 immediately, do you remember how long it took for DX11 to be required?

 

Even if Pascal doesn't last once DX12 and Vulkan are mainstream it won't matter, because they won't be fast enough to run the games anyway.  By 2019 we're going to have new consoles and a 1080 will be the equivalent of a 680 today.

It took exactly 2 years for DX11 to be a big factor. How long have we had DX12? Oh look, a year. Next year it will be a big thing and people buying 480/1060s are coming from 760s/280s - those people want LONGEVITY - something Pascal does not have.

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


11 hours ago, TrigrH said:

If I had to guess, I think it will win (since it's a big 14nm die), but not on power efficiency, which isn't a major issue given the target market for Vega.

We really don't know yet about efficiency. From what I've seen, it looks like AMD may finally be ditching ancient GCN in favor of a new architecture. Because while Polaris may be quite different from Tahiti, it still is GCN at its core, for better or for worse. The weaknesses of the architecture can be reduced somewhat, but you can't go on forever adding features to 2011 tech.

AMD Ryzen R7 1700 (3.8ghz) w/ NH-D14, EVGA RTX 2080 XC (stock), 4*4GB DDR4 3000MT/s RAM, Gigabyte AB350-Gaming-3 MB, CX750M PSU, 1.5TB SDD + 7TB HDD, Phanteks enthoo pro case


1 hour ago, don_svetlio said:

It took exactly 2 years for DX11 to be a big factor. How long have we had DX12? Oh look, a year. Next year it will be a big thing and people buying 480/1060s are coming from 760s/280s - those people want LONGEVITY - something Pascal does not have.

You know I'd accept that statement if Pascal didn't have longevity. But it clearly does.

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


1 hour ago, don_svetlio said:

It took exactly 2 years for DX11 to be a big factor. How long have we had DX12? Oh look, a year. Next year it will be a big thing and people buying 480/1060s are coming from 760s/280s - those people want LONGEVITY - something Pascal does not have.

DX11 wasn't that huge of a change, though, especially for game devs (all hardware and API features from DX10.1 were kept, and new features were added on top for new functionality). The main things it brought were hardware tessellation, GPGPU support and better multithreading.

 

I think it's going to take a little longer for game devs to switch to DX12. It should also be remembered that, for a lot of (smaller) games out there, the new features it brings simply aren't necessary, and DX11 would do just the same thing quite a lot more easily, with little to no difference in performance.

i5 4670k @ 4.2GHz (Coolermaster Hyper 212 Evo); ASrock Z87 EXTREME4; 8GB Kingston HyperX Beast DDR3 RAM @ 2133MHz; Asus DirectCU GTX 560; Super Flower Golden King 550 Platinum PSU;1TB Seagate Barracuda;Corsair 200r case. 


1 hour ago, Curufinwe_wins said:

You know I'd accept that statement if Pascal didn't have longevity. But it clearly does.

We shall speak again next year. Feel free to bookmark this thread

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


10 hours ago, Curufinwe_wins said:

 

Yeah, you are just blatantly wrong here. There were zero hardware path changes to the die or anything else (other than the use of newer, higher-quality components).

 

There were software optimizations (quite similar to the 480X patch, actually) that, along with the extremely mature process, meant card-by-card performance was MUCH more uniform, but the best 390(X) cards paled in comparison to the best 290X 8GB models (consistently reaching 1.15 was fairly easy, but only MSI and XFX models could even more than occasionally go above that).

 

The biggest single improvements were the tessellation and color-compression formulas that were brought over from Tonga. These improvements did make their way to the 290(X) variants a few weeks later (mid-August 2015).

 

If you read any of the AnandTech article, you would know otherwise. There are quite a few changes to the front end of Pascal vs. Maxwell.

Make sure to quote me or tag me when responding to me, or I might not know you replied! Examples:

 

Do this:

Quote

And make sure you do it by hitting the quote button at the bottom left of my post, and not the one inside the editor!

Or this:

@DocSwag

 

Buy whatever product is best for you, not what product is "best" for the market.

 

Interested in computer architecture? Still in middle or high school? P.M. me!

 

I love computer hardware and feel free to ask me anything about that (or phones). I especially like SSDs. But please do not ask me anything about Networking, programming, command line stuff, or any relatively hard software stuff. I know next to nothing about that.

 

Compooters:

Spoiler

Desktop:

Spoiler

CPU: i7 6700k, CPU Cooler: be quiet! Dark Rock Pro 3, Motherboard: MSI Z170a KRAIT GAMING, RAM: G.Skill Ripjaws 4 Series 4x4gb DDR4-2666 MHz, Storage: SanDisk SSD Plus 240gb + OCZ Vertex 180 480 GB + Western Digital Caviar Blue 1 TB 7200 RPM, Video Card: EVGA GTX 970 SSC, Case: Fractal Design Define S, Power Supply: Seasonic Focus+ Gold 650w Yay, Keyboard: Logitech G710+, Mouse: Logitech G502 Proteus Spectrum, Headphones: B&O H9i, Monitor: LG 29um67 (2560x1080 75hz freesync)

Home Server:

Spoiler

CPU: Pentium G4400, CPU Cooler: Stock, Motherboard: MSI h110l Pro Mini AC, RAM: Hyper X Fury DDR4 1x8gb 2133 MHz, Storage: PNY CS1311 120gb SSD + two Seagate 4tb HDDs in RAID 1, Video Card: Does Intel Integrated Graphics count?, Case: Fractal Design Node 304, Power Supply: Seasonic 360w 80+ Gold, Keyboard+Mouse+Monitor: Does it matter?

Laptop (I use it for school):

Spoiler

Surface book 2 13" with an i7 8650u, 8gb RAM, 256 GB storage, and a GTX 1050

And if you're curious (or a stalker) I have a Just Black Pixel 2 XL 64gb

 


1 hour ago, Tech Inquisition said:

TL:DR...

 

Agreed...

Even Maxwell, to a degree, has some longevity (partially because of full DX12.1 support, and the fact that it seems very similar to Pascal), something Kepler did not have. Older AMD cards will too, if AMD decides to keep working on their software and not drop new drivers for 200 and 300 series cards (which thankfully they haven't). Aside from that, DX12 and Vulkan have proven that a lot more performance can be squeezed out of AMD cards. Everyone wins: AMD users get equal or better performance than Nvidia cards in the future, and Nvidia owners have nothing to worry about.

 

Keep in mind, and this is for all graphics cards, Nvidia or AMD: the truth is games aren't going to get any worse-performing, only better. So as we see games get more efficient, these older cards will last longer no matter what. And let's be honest, games don't seem to be getting better-looking by a huge amount; Crysis still looks better than most games released recently, for reference. So please, stop trying to cause a fucking panic and have people going "OH NO MY GRAPHICS CARD WON'T RUN GAMES IN 3 YEARS".

 


18 hours ago, PerfectTemplar said:

Evidently, it wasn't meant to compete with the 1060 either.

 

 

970* performance.

It's not 970 performance. It's closer to a 980 than to a 970.


11 hours ago, Autoimmunity said:

By the time DX12 and Vulkan are required, Pascal will be ancient anyway.  It's not like devs are going to force people to use DX12 immediately, do you remember how long it took for DX11 to be required?

 

Even if Pascal doesn't last once DX12 and Vulkan are mainstream it won't matter, because they won't be fast enough to run the games anyway.  By 2019 we're going to have new consoles and a 1080 will be the equivalent of a 680 today.

Actually, the reason DX11 took so long was because the adoption rate from XP/Vista to Win7 took forever.

With Win10, however, the adoption rate has been unprecedented, and as such, the speed at which developers can start to target DX12 is higher. They do not have to wait 2-3 years this time for Win10 to reach enough market share before they risk "locking" their game to DX12. I reckon that already in a year we will see a vast majority of games starting to be DX12-only, as the adoption rate of Win10, and thus of the OS requirement to run DX12, is so high.

http://store.steampowered.com/hwsurvey/directx/

Look at this: Win10 ALREADY has the majority share among the operating systems. I reckon A LOT of people jumped onto the free upgrade this week, as it was their last chance, so the July HwSurvey should see this market share increase even further.

A game developer is only concerned about "target market %", meaning how many % of the market their game can address with the requirements outlined for the game.

Due to a rapid increase in the adoption of more powerful hardware lately, we have seen games jump from using 8800 GTs in their minimum requirements to HD 7770s and GTX 660s. A MAJOR improvement from how it used to be. Especially the CPU requirements have improved, as more people now than ever are using a quad-core CPU at minimum. Thus the addressable core count, and as such the addressable performance tier, is actually there.
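The "target market %" logic boils down to a back-of-the-envelope multiplication; here's a sketch with placeholder shares (not real survey numbers):

```python
# Back-of-the-envelope addressable-market estimate for a DX12-only game.
# All shares below are hypothetical placeholders, not real survey numbers.
win10_share = 0.50      # fraction of gamers on Windows 10
dx12_gpu_share = 0.70   # fraction of those with a DX12-capable GPU
meets_min_spec = 0.80   # fraction of those meeting the game's minimum spec

addressable = win10_share * dx12_gpu_share * meets_min_spec
print(f"Addressable market: {addressable:.0%}")  # -> 28%
```

Because the shares multiply, the OS requirement is usually the binding constraint: a jump in Win10 adoption lifts the whole product directly.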

 

 


21 hours ago, Dresta said:

Intel started going Nvidia's way; I mean, releasing a $1700 CPU with just 2 more cores and some features which many people are not even gonna use is just stupid.

 

Yeah, after about 1 year of drivers and custom coolers it will be at the 980 level or even better, maybe even a 980 Ti if Gaben helps us.

After a YEAR of drivers and custom coolers (citation needed desperately)...

 

The GTX 1060 TODAY is 980 level of performance and is the exact same price as even the 4GB RX 480 (at least in Australia).

i7 6700K - ASUS Maximus VIII Ranger - Corsair H110i GT CPU Cooler - EVGA GTX 980 Ti ACX2.0+ SC+ - 16GB Corsair Vengeance LPX 3000MHz - Samsung 850 EVO 500GB - AX760i - Corsair 450D - XB270HU G-Sync Monitor

i7 3770K - H110 Corsair CPU Cooler - ASUS P8Z77 V-PRO - GTX 980 Reference - 16GB HyperX Beast 1600MHz - Intel 240GB SSD - HX750i - Corsair 750D - XB270HU G-Sync Monitor

5 minutes ago, PerfectTemplar said:

After a YEAR of drivers and custom coolers (citation needed desperately)...

 

The GTX 1060 TODAY is 980 level of performance and is the exact same price as even the 4GB RX 480 (at least in Australia).

That's weird - the people I know from Aus told me that the 1060 is ludicrously expensive compared to the 480. Can I have a link to a 1060 for $300 AUD or less?

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


43 minutes ago, Prysin said:

Actually, the reason DX11 took so long was because the adoption rate from XP/Vista to Win7 took forever.

With Win10, however, the adoption rate has been unprecedented, and as such, the speed at which developers can start to target DX12 is higher. They do not have to wait 2-3 years this time for Win10 to reach enough market share before they risk "locking" their game to DX12. I reckon that already in a year we will see a vast majority of games starting to be DX12-only, as the adoption rate of Win10, and thus of the OS requirement to run DX12, is so high.

http://store.steampowered.com/hwsurvey/directx/

Look at this: Win10 ALREADY has the majority share among the operating systems. I reckon A LOT of people jumped onto the free upgrade this week, as it was their last chance, so the July HwSurvey should see this market share increase even further.

A game developer is only concerned about "target market %", meaning how many % of the market their game can address with the requirements outlined for the game.

Due to a rapid increase in the adoption of more powerful hardware lately, we have seen games jump from using 8800 GTs in their minimum requirements to HD 7770s and GTX 660s. A MAJOR improvement from how it used to be. Especially the CPU requirements have improved, as more people now than ever are using a quad-core CPU at minimum. Thus the addressable core count, and as such the addressable performance tier, is actually there.

 

 

There is kind of a problem with DX12 at the moment, however: it's apparently a buggy mess.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

