Rumor: Nvidia’s Pascal Architecture Is In Trouble With Asynchronous Compute

Mr_Troll

Even with this news I could never buy a GPU from a company that is spearheaded by Richard Huddy and Roy Taylor.

CPU: Intel Core i7 7820X Cooling: Corsair Hydro Series H110i GTX Mobo: MSI X299 Gaming Pro Carbon AC RAM: Corsair Vengeance LPX DDR4 (3000MHz/16GB 2x8) SSD: 2x Samsung 850 Evo (250/250GB) + Samsung 850 Pro (512GB) GPU: NVidia GeForce GTX 1080 Ti FE (W/ EVGA Hybrid Kit) Case: Corsair Graphite Series 760T (Black) PSU: SeaSonic Platinum Series (860W) Monitor: Acer Predator XB241YU (165Hz / G-Sync) Fan Controller: NZXT Sentry Mix 2 Case Fans: Intake - 2x Noctua NF-A14 iPPC-3000 PWM / Radiator - 2x Noctua NF-A14 iPPC-3000 PWM / Rear Exhaust - 1x Noctua NF-F12 iPPC-3000 PWM


12 minutes ago, i_build_nanosuits said:

all that i knew already, a GPU is a very specific and complex piece of engineering and they will be better at some tasks vs others, obviously.

that's why some games favor AMD, and most games favor nvidia hardware.

And if you knew that, you'd know AMD did far more than just increase the core count.

 

Most games favor current-generation Nvidia because Nvidia built the hardware to excel at just one fucking thing.

 

Maxwell is about as multi-purpose as a fucking nosehair-trimmer.

 


4 minutes ago, Citadelen said:

I didn't know Total War: Warhammer was using Async, that's great, maybe I'll be able to play it at more than 20FPS. :D

Yup: http://www.guru3d.com/news-story/amd-announces-directx-12-game-engine-developer-partnerships.html
 

Quote

Total War: WARHAMMER
A fantasy strategy game of legendary proportions, Total War: WARHAMMER combines an addictive turn-based campaign of epic empire-building with explosive, colossal, real-time battles, all set in the vivid and incredible world of Warhammer Fantasy Battles. 
Sprawling battles with high unit counts are a perfect use case for the uniquely powerful GPU multi-threading capabilities offered by Radeon graphics and DirectX 12. Additional support for DirectX 12 asynchronous compute will also encourage lightning-fast AI decision making and low-latency panning of the battle map.
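The "GPU multi-threading" the quote describes boils down to letting a compute queue fill gaps the graphics queue leaves idle. Here's a toy timing model of that idea (my own sketch, not vendor code; the millisecond figures and the `overlap` fraction are invented assumptions for illustration):

```python
# Toy model of why asynchronous compute can shorten a frame.
# All numbers are illustrative assumptions, not measurements of any real GPU.

def frame_time_serial(graphics_ms: float, compute_ms: float) -> float:
    """Graphics and compute workloads run back to back on one queue."""
    return graphics_ms + compute_ms

def frame_time_async(graphics_ms: float, compute_ms: float,
                     overlap: float = 0.8) -> float:
    """A fraction of the compute work hides inside idle gaps in the
    graphics workload. `overlap` (how much can be hidden) is an
    assumed figure; real overlap depends on the hardware scheduler."""
    hidden = compute_ms * overlap
    return graphics_ms + (compute_ms - hidden)

if __name__ == "__main__":
    gfx, comp = 12.0, 4.0  # ms per frame, made-up workload sizes
    print(f"serial: {frame_time_serial(gfx, comp):.1f} ms")
    print(f"async : {frame_time_async(gfx, comp):.1f} ms")
```

On hardware that serializes the two queues, the frame costs the full sum; hardware with working async compute approaches the overlapped figure, which is why the feature matters for unit-heavy scenes like these.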

 

 

 

1 minute ago, i_build_nanosuits said:

well then there you go...my WHOLE point to this thread i guess was this:

AMD peasant, don't slap it too high when you see that nvidia cards does not support async compute well...because in the end NVIDIA HAS THE MONEY and the MARKET SHARE and game developers develop games to appeal to the MASSES...so you should be VERY sad if pascal is not good at Async compute cause that would mean that you won't see that many games taking full advantage of it. end of the story.

AMD peasant? You think you are above other people because you bought the third most expensive NVidia card? You've proved you don't understand how architectures work, or even how the newest ones differ.

 

You know who has a larger market share than NVidia? PS4 and XBone. You know, those AMD GCN-based consoles that all devs make their engines for, using async compute since day one? You overestimate the power (and money) of NVidia. Nvidia might have given enough "incentives" to make some devs use GameWorks in their PC ports, but that's about it. The backlash against GameWorks has gotten more and more devs to drop it and focus on things they actually have power over. One such thing is Vulkan/DX12 with async compute.

 

Your conclusion is a simple fallacy with nothing to back it up.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


1 hour ago, Notional said:

So what you're saying is that you can't continue to switch between graphics vendors, because you are now invested in a Gsync monitor?

Now you know what vendor lock in is, and why it's anti competitive. What you are experiencing is called high switching costs, where it's not just a case of buying a new card, but having to replace other things as well, in this case a gsync monitor.

 

I've warned about this in here for over a year.

 

For the consumers' sake, I hope NVidia drops the ball, trips, and smashes their face into the ground trying to catch it. We need proper competition, and I truly hope AMD will gain huge market share this year and next. With DX12 and Vulkan, which AMD has been so active in shaping, I think that might very well be possible. With things like async compute and potent hardware that is not completely dependent on driver optimizations, we already know AMD cards keep performing for a long time. Hopefully NVidia users looking to upgrade their 700 series cards are fed up with the planned obsolescence that turns premium-priced hardware into crap after just a year or two.

Nope, I was merely pointing out I aint a fan boy, unlike some ........

 

If the performance difference at a given budget is worth losing Gsync I will buy an AMD GPU, gsync isn't that much of a deal breaker. Hell I could even sell the monitor to a fanboy and buy a freesync monitor with the cash!!

 

Ryzen Ram Guide

 

My Project Logs   Iced Blood    Temporal Snow    Temporal Snow Ryzen Refresh

 

CPU - Ryzen 1700 @ 4Ghz  Motherboard - Gigabyte AX370 Aorus Gaming 5   Ram - 16Gb GSkill Trident Z RGB 3200  GPU - Palit 1080GTX Gamerock Premium  Storage - Samsung XP941 256GB, Crucial MX300 525GB, Seagate Barracuda 1TB   PSU - Fractal Design Newton R3 1000W  Case - INWIN 303 White Display - Asus PG278Q Gsync 144hz 1440P


2 minutes ago, stealth80 said:

Nope, I was merely pointing out I aint a fan boy, unlike some ........

 

If the performance difference at a given budget is worth losing Gsync I will buy an AMD GPU, gsync isn't that much of a deal breaker. Hell I could even sell the monitor to a fanboy and buy a freesync monitor with the cash!!

Ah ok, that's fine.

My point is that a consumer should never be put in a situation where they miss out on functionality on a third-party piece of hardware because they switched a first-party one. After all, people tend to buy nice expensive monitors to last 4-5 years or so, whereas people might go through 2 GPUs in that time frame.

 

Gsync was an amazing innovation, but it should have been an industry standard. Heck NVidia could even take a small licensing fee for it too. It should have been Adaptive Sync, but NVidia refused to license it. Now we have an industry standard and NVidia refuses to support it.

 

The problem still stands though. High switching costs will be an issue for some people.



11 minutes ago, Citadelen said:

Of course, I'd forgotten a 7970 was as good as the Fury X , damm, I could have saved LOADS of money.

/s

Think he's pointing towards 7970 > 7970 GHz Edition > 280X, 290 > 390, 290X > 390X etc. AMD appear to be rinsing the tech. Why didn't they just release the 390/390X when Maxwell hit instead of leaving it this late? I know there's the "manufacturing process difficulties" and other things. It all feeds the conspiracy theory that tech companies already have the future tech but release it in upgradeable chunks, or they'd kill the business. I'm not saying I believe this, but plenty do.

 

Nvidia aren't exempt either; it's just been very obvious from AMD in the last few years.

 



8 minutes ago, stealth80 said:

Nope, I was merely pointing out I aint a fan boy, unlike some ........

 

If the performance difference at a given budget is worth losing Gsync I will buy an AMD GPU, gsync isn't that much of a deal breaker. Hell I could even sell the monitor to a fanboy and buy a freesync monitor with the cash!!

Sell it to @i_build_nanosuits so he can have variable refresh rate for his $649 nosehair-trimmer 


2 hours ago, i_build_nanosuits said:

EDIT: ^^ i'm quite proud of that one actually, i think it should get a few like and a few thumbs up and maybe...well, you know :P that's how it is, that's how it goes...that's life...it's sucks, might not be the best for consumers, but that,s how it is...you can rally with the weak all you want, but i wont.

"rally with"

goddamn fanboyism. this shit is cancer. listen to yourself, ffs.


All of this because of an unsubstantiated rumor. God damn WCCF.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


2 hours ago, i_build_nanosuits said:

we are still FAR from such a scenario...nvidia still has the upper hand at every level.

Show me one benchmark that supports that claim?


17 minutes ago, i_build_nanosuits said:

well then there you go...my WHOLE point to this thread i guess was this:

AMD peasant, don't slap it too high when you see that nvidia cards does not support async compute well...because in the end NVIDIA HAS THE MONEY and the MARKET SHARE and game developers develop games to appeal to the MASSES...so you should be VERY sad if pascal is not good at Async compute cause that would mean that you won't see that many games taking full advantage of it. end of the story.

I'm not going to go into this whole Nvidia vs AMD thing that this thread has become but just to throw it out there: 

 

Regarding market share, both consoles use GCN and most AAA games are console ports, which doesn't mean sunshine and rainbows for AMD and doom and gloom for Nvidia, but keep that in mind.

 

When talking about rebranding, their rebranded cards are competitive, so they had the luxury of doing that. Not that good for us consumers because progress slows down, but Nvidia allowed them to do it, so they are both at fault.

 

In the end, async compute looks to be a great thing and it would be most favorable if all companies supported it.


@i_build_nanosuits

Nvidia's voxel-based lighting system in Rise of the Tomb Raider uses compute, and it's Nvidia that put it there. It was present even in the Xbox version; not 100% sure about the PS4.

 

Also, most of the DX12_1 features that AMD doesn't have aren't very taxing to emulate on AMD hardware, according to the Tech Report video on DX12.


1 minute ago, Mister Snow said:

When talking about rebranding, their rebranded cards are competitive so they had a luxury of doing that. Not that good for us consumers because progress slows down but Nvidia allowed them to do it so they are both at fault.

 

In the end, async compute looks to be a great thing and it would be most favorable if all companies supported it.

To be fair, it's mostly TSMC's fault for being incompetent with 20nm chips. :( Both AMD and NVidia suffered from that, but AMD probably suffered most. Seeing 14nm FF going up against 16nm FF is going to be very interesting.

 

And yeah Async compute is brilliant. Will be for NVidia too once they support it well.



2 hours ago, i_build_nanosuits said:

how relevant is a GPU when it's NOT directly supported by 90% of the game developers out there is the question YOU should ask yourself...and provide me with an answer.

...and don't come with the 970, it's a weak ass card with not enough compute cores that's why it's not as fast as a 300W radeon beast.

Which I've already noticed myself; it's bloody useless outside gaming.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


56 minutes ago, i_build_nanosuits said:

Rise of the Tomb Raider if you havnt played that it's a must have IMHO. beautiful game, fun and entertaining, really well put togheter...will run well on a GTX 680 too.

http://www.metacritic.com/game/pc/rise-of-the-tomb-raider

 

Doesn't interest me; I'm more into shooters. COD and those types of retarded shooters bore me, and I'm really over modern shooters in general, but I'm pretty sure Day of Defeat will be my go-to FPS for all time. If I'm not playing that, maybe an MMO, but there are none; I can't play the themepark gear-treadmill games anymore. So I'm thinking my gaming days are all but over. One more upgrade will be it for me.

Processor: Intel core i7 930 @3.6  Mobo: Asus P6TSE  GPU: EVGA GTX 680 SC  RAM:12 GB G-skill Ripjaws 2133@1333  SSD: Intel 335 240gb  HDD: Seagate 500gb


Monitors: 2x Samsung 245B  Keyboard: Blackwidow Ultimate   Mouse: Zowie EC1 Evo   Mousepad: Goliathus Alpha  Headphones: MMX300  Case: Antec DF-85


37 minutes ago, stealth80 said:

-snip-

You gave the answer yourself: process advances enable higher clocks at the same temperature and wattage, to name a few bonuses.

Pixelbook Go i5 Pixel 4 XL


42 minutes ago, stealth80 said:

think hes pointing towards 7970 > 7970ghz edition > 280X, 290 > 390, 290X > 390X etc AMD appear to be rinsing the tech. Why didn't they just release the 390/390X when Maxwell hit instead of leaving it this late, and I know theres the "manufacturing process difficulties" and other things. It all leads to the conspiracy of tech companies already have the future tech but release it in upgradable chunks or they will kill the business. I'm not saying I believe this, but plenty do.

 

Nvidia aren't exempt either, its just been very obvious from AMD in the last few years

Nvidia is certainly not exempt, as they have done the same refinement-rebrand shit from the GTX 400 to the GTX 700 series.


I always love topics about "Nvidia can't do this, AMD can't do that," because people go apeshit and berserk, and I was like this behind my monitor:

 

popcorn-blank.gif

 

 

Is it too late for nGreedia to fix their AC bullshit or are they (deep) fucked?

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


1 hour ago, i_build_nanosuits said:

maybe yes, but nvidia never released a GTX 880ti which would have been 100% the same as the GTX 780ti but with 6gb of video memory? have i missed something here?

Segmenting among the same family of GPU is one thing, you need to have faster and slower products to compete...but re-branding them generations after generations without ever ''REALLY'' improving on anything performance wise is another story.

Well, rebranding is another topic entirely... I thought we were talking about AMD's newer GPU designs just upping the core count instead of making any technological improvements. You're talking about some of AMD's newer products not using newer GPU designs at all (but pretending that they are, which is an annoying practice well worth discussing, but still a different topic). Yes, the 390 doesn't even use AMD's newer designs, so it's a bit dated in technology compared to the 970.

But this thread is about AMD's newest architecture and how it will stack up to NVIDIA's. AMD's most current technology isn't that far behind NVIDIA's, even if not all of their current chips use that most current technology. The gap between AMD's most current stuff and NVIDIA's isn't as large as you make it out to be, so AMD's next architecture catching up with NVIDIA isn't that outrageous.


2 hours ago, Notional said:

Gsync was an amazing innovation, but it should have been an industry standard. Heck NVidia could even take a small licensing fee for it too. It should have been Adaptive Sync, but NVidia refused to license it. Now we have an industry standard and NVidia refuses to support it.

Industry standard although it's only used by AMD? Hmm...not really. At least not for now. Intel might change things in the future though.

i5 4670k @ 4.2GHz (Coolermaster Hyper 212 Evo); ASrock Z87 EXTREME4; 8GB Kingston HyperX Beast DDR3 RAM @ 2133MHz; Asus DirectCU GTX 560; Super Flower Golden King 550 Platinum PSU;1TB Seagate Barracuda;Corsair 200r case. 


3 hours ago, i_build_nanosuits said:

2 FPS? 1216 cuda cores LESS on the 970 (-42%)...are you KIDDING ME son?! o.O

Yea, usually when you throw out the back seat of a car, it becomes lighter, faster and more fuel efficient. I mean you can only carry 2 people now, but hey. It's when you suddenly need to carry 2 or 3 extra people that you suddenly wish you still had your backseat. Dat efficiency:

Spoiler



 

So again, what improvements are you talking about that Nvidia has done that AMD hasn't? I've already acknowledged AMD's inferior tessellation performance even if they improved it over GCN 1.1 and 1.0. I wanna know if I'm missing something.



I called it.

 

14th of December, in fact.

 

 

BOW TO YOUR PROPHET

 

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.


I'm going to assume this information is false, since Nvidia isn't that stupid; plus, it's very rare that tech rumors about AMD or Nvidia are true, due to all the stupid fanboyism. Either way, DX12 is still not relevant to me, and until a game I actually want to play comes out as a DX12 exclusive, it will remain that way.

https://linustechtips.com/main/topic/631048-psu-tier-list-updated/ Tier Breakdown (My understanding)--1 Godly, 2 Great, 3 Good, 4 Average, 5 Meh, 6 Bad, 7 Awful

 

