
(Updated) AMD Navi GPU to Offer GTX 1080 Class Performance at ~$250 Report Claims

Ryujin2003

If the die stacking via infinity fabric thing is not ready for 2019 GPUs then that is really bad news for RTG!

 

This was supposed to be the big draw of Navi: scalability! No longer needing to clock GPUs to the limit, instead going very wide with multiple dies, each one clocked in its efficient range, getting around the current GCN 1.2 bottlenecks in geometry performance while avoiding the yield issues of big monolithic dies. Navi in 2019 was supposed to be the Threadripper of GPUs.

 

So if this report is true and the big, powerful die-stacked Navi is postponed until 2020 due to engineering issues, that means a whole extra year before AMD can start competing with Nvidia's top end once again.


8 minutes ago, Humbug said:

If the die stacking via infinity fabric thing is not ready for 2019 GPUs then that is really bad news for RTG!

 

This was supposed to be the big draw of Navi: scalability! No longer needing to clock GPUs to the limit, instead going very wide with multiple dies, each one clocked in its efficient range, getting around the current GCN 1.2 bottlenecks in geometry performance while avoiding the yield issues of big monolithic dies. Navi in 2019 was supposed to be the Threadripper of GPUs.

 

So if this report is true and the big, powerful die-stacked Navi is postponed until 2020 due to engineering issues, that means a whole extra year before AMD can start competing with Nvidia's top end once again.

Even if die stacking works with Navi, that won't change the need for a totally different graphics pipeline, and that would also mean new APIs. Graphics are bound by throughput: the more you can do, and the faster you can push data through, the faster your processor will perform. You can't introduce the kind of latency Infinity Fabric adds for inter-die communication unless the software is there to work around that problem.
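The latency point can be put in rough numbers. Here's a toy model (all figures hypothetical; nothing below reflects real Infinity Fabric latencies or a real workload) of how exposed inter-die hop latency erodes the gain from splitting a frame across two dies unless the software overlaps it with other work:

```python
# Toy model: splitting a frame's GPU work across two dies.
# All numbers are made up for illustration only.

def frame_time_us(work_us, hops, hop_latency_us, overlap):
    """Frame time when `hops` inter-die round trips occur.
    overlap = fraction of hop latency the software hides by
    keeping other work in flight (0 = fully exposed, 1 = hidden)."""
    exposed = hops * hop_latency_us * (1.0 - overlap)
    return work_us + exposed

single_die = frame_time_us(16_000, hops=0, hop_latency_us=0.5, overlap=0.0)
# Two dies halve the work, but 2000 exposed 0.5 us hops eat into the win:
naive_mcm = frame_time_us(8_000, hops=2_000, hop_latency_us=0.5, overlap=0.0)
# Software that hides 90% of the latency recovers most of the benefit:
tuned_mcm = frame_time_us(8_000, hops=2_000, hop_latency_us=0.5, overlap=0.9)

print(single_die, naive_mcm, tuned_mcm)
```

The point of the sketch is only that the split pays off in proportion to how well the software hides the communication, which is exactly the API/software problem described above.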

 

The geometry bottleneck is one of the key areas where AMD has issues; the other is raw shader throughput.

 

Geometry is the easy one to see: the pipeline stalls if the GS units are overtasked, because it's fixed-function and the pipeline stages are linear. So the number of GS units a chip has, at a given frequency, is all it can do. There is no way around this with current APIs.
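As a back-of-envelope illustration of that fixed-function ceiling (unit counts and clocks below are hypothetical, not any real chip's figures):

```python
# Fixed-function geometry throughput is just units * primitives/clock * clock.
# Once a frame submits more primitives than this budget, the pipeline stalls
# no matter how much shader power sits behind it. Figures are hypothetical.

def geometry_rate(gs_units, prims_per_clock, clock_ghz):
    """Peak primitives per second for a fixed-function geometry front end."""
    return gs_units * prims_per_clock * clock_ghz * 1e9

peak = geometry_rate(4, 1, 1.5)    # 4 units at 1.5 GHz -> 6e9 prims/s
budget_60fps = peak / 60           # per-frame primitive budget at 60 fps
print(peak, budget_60fps)
```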

 

Vega's primitive shaders were made so programmers could run GS calculations on the shader array instead, although that has never come to fruition. We can surmise from AMD's direction here that they chose not to increase the number of GS units, most likely due to die size constraints; they have changed the unit count between previous GCN architectures, so it doesn't seem to be tied to any other unit.

 

Raw shader throughput is a bigger task to deal with; it requires the plumbing of the chip to be redone. That isn't easy to do at the compiler level with complex shader programs. We're looking at shader programs that are now larger than some entire engines from the first generation of 3D graphics lol.

 


6 minutes ago, Razor01 said:

For now that doesn't seem to be the case as Vega 7 will remain the sole high end compute chip.

Indeed, but remember Vega is getting a 12nm refresh which is exclusive to the professional Instinct line. Navi won't be out until 2019 anyways, and I'm sure Vega will do well there on 12nm. Heck, it might even get a 7 nm refresh later on, but it depends if Navi can be scaled for compute also.

 

As for the R&D on Navi: well, they can still put more into creating more chips, and definitely do more to make intrinsic packed math actually work, along with all the other Vega features that either don't work or aren't implemented for lack of funding. I hope Navi will be functional. But honestly, with Fury and Vega being underwhelming HBM-based gaming GPUs, I don't have huge hopes.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


17 minutes ago, M.A.P said:

Wow! GTX 1080 performance in 2019 xD. Nvidia is going to catch up too LOL. It's like comparing a 1080 Ti with a 780 Ti (which is equivalent to a GTX 1060) LOL.

How can you expect that to be a good deal when they're planning to launch in 2019, not now? >:( Don't get excited guys, the competition will still be like RX 480/580 vs GTX 1060.

 

 

1080 performance for $250 is a good deal. There's no other way to slice this if it's real. Though I doubt it is; this same hype surrounded the RX 480.

Laptop: 2019 16" MacBook Pro i7, 512GB, 5300M 4GB, 16GB DDR4 | Phone: iPhone 13 Pro Max 128GB | Wearables: Apple Watch SE | Car: 2007 Ford Taurus SE | CPU: R7 5700X | Mobo: ASRock B450M Pro4 | RAM: 32GB 3200 | GPU: ASRock RX 5700 8GB | Case: Apple PowerMac G5 | OS: Win 11 | Storage: 1TB Crucial P3 NVME SSD, 1TB PNY CS900, & 4TB WD Blue HDD | PSU: Be Quiet! Pure Power 11 600W | Display: LG 27GL83A-B 1440p @ 144Hz, Dell S2719DGF 1440p @144Hz | Cooling: Wraith Prism | Keyboard: G610 Orion Cherry MX Brown | Mouse: G305 | Audio: Audio Technica ATH-M50X & Blue Snowball | Server: 2018 Core i3 Mac mini, 128GB SSD, Intel UHD 630, 16GB DDR4 | Storage: OWC Mercury Elite Pro Quad (6TB WD Blue HDD, 12TB Seagate Barracuda, 1TB Crucial SSD, 2TB Seagate Barracuda HDD)

2 minutes ago, Notional said:

Indeed, but remember Vega is getting a 12nm refresh which is exclusive to the professional Instinct line. Navi won't be out until 2019 anyways, and I'm sure Vega will do well there on 12nm. Heck, it might even get a 7 nm refresh later on, but it depends if Navi can be scaled for compute also.

 

As for the R&D on Navi. Well, they can still put more into creating more chips. And definitely do more to make intrinsic packed math actually work, and all the other stuff Vega has, that either doesn't work, or isn't implemented for the lack of funding. I hope Navi will be functional. But honestly, with Fury and Vega being underwhelming HBM based gaming GPU, I don't have huge hopes.

RPM is only part of the issue. 

 

Look at the G80 and compare it to Fermi, Fermi to Kepler, and then Kepler to Maxwell and Pascal: we can see how nV optimized their chips for ease of programmer use. The same code just runs better and faster on each new generation of cards, even with the same number of units capable of the same number of calculations at the same frequency. This isn't IPC, which is what RPM gives: doing additional calculations at lower precision at the same time.

 

This is throughput: doing the same code, the same calculations, just more of it over a period of time. It's like going from a 1-inch diameter pipe to a 2-inch diameter pipe.
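The IPC-vs-throughput distinction can be made concrete with a quick FLOPS calculation (shader count and clock are hypothetical, chosen only to make the arithmetic round):

```python
# RPM ("rapid packed math") doubles the rate only by packing two FP16 ops
# into one FP32 slot; a throughput-oriented redesign raises the FP32 rate
# itself, so the same code at the same precision simply runs faster.
# Shader count and clock below are hypothetical.

def fp32_tflops(shaders, clock_ghz, flops_per_clock=2):  # FMA = 2 FLOPs
    return shaders * flops_per_clock * clock_ghz / 1000

base_fp32 = fp32_tflops(4096, 1.5)   # ~12.3 TFLOPS FP32
rpm_fp16 = 2 * base_fp32             # 2x rate, but only at FP16 precision
print(base_fp32, rpm_fp16)
```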


1 minute ago, Razor01 said:

RPM is only part of the issue. 

 

Look at the G80 and compare it to Fermi, Fermi to Kepler, and then Kepler to Maxwell and Pascal: we can see how nV optimized their chips for ease of programmer use. The same code just runs better and faster on each new generation of cards, even with the same number of units capable of the same calculations at the same frequency. This isn't IPC, which is what RPM gives: doing additional calculations at lower precision at the same time.

 

This is throughput: doing the same code, the same calculations, just more of it over a period of time. It's like going from a 1-inch diameter pipe to a 2-inch diameter pipe.

Indeed, and there are simply too many idle parts on the GCN architecture (which is also what RPM tries to negate), but also bottlenecks within it (like the 64 ROPs and so on).

 

I am very interested in what Navi brings. Not because I'm hyped; Fury and Vega taught us not to be. But because we might actually see a lot of new technology: the Vega features implemented to actually work for once, maybe some Infinity Fabric usage, or games using the advanced part of the Vega memory controller?!

 

Either way, if Intel suddenly found the patents to make actual dedicated gaming GPUs, then RTG really needs to get their foot out of their donkey, so they won't be crushed between a Crimelord and a Scumbag.



1 minute ago, Notional said:

Indeed, and there are simply too many idle parts on the GCN architecture (which is also what RPM tries to negate), but also bottlenecks within it (like the 64 ROPs and so on).

 

I am very interested in what Navi brings. Not because I'm hyped; Fury and Vega taught us not to be. But because we might actually see a lot of new technology: the Vega features implemented to actually work for once, maybe some Infinity Fabric usage, or games using the advanced part of the Vega memory controller?!

 

Either way, if Intel suddenly found the patents to make actual dedicated gaming GPUs, then RTG really needs to get their foot out of their donkey, so they won't be crushed between a Crimelord and a Scumbag.

The memory controller is interesting. But personally, from what we've seen with Volta and its global caching, it seems like nV has something similar. We don't have details on it yet, but on the surface it looks quite close. Pro Pascal chips show this ability too, but we can't see it on gaming cards given the lack of HBM; it could be something to do with the type of memory and the bandwidth needed to sustain a global cache at that level. The inclusion of nV's NVLink also kind of points to this.


So, a 2019 card being equal to a 2016 card.

If NVidia falls dead and does not do anything until then, AMD will have a big winner!

 

Luckily NVidia won't do that and consumers can enjoy better performance by 2019.

But hey, AMD will have a nice card for 250 bucks that is awesome for the upper low end by then!


5 minutes ago, Notional said:

Indeed, and there are simply too many idle parts on the GCN architecture (which is also what RPM tries to negate), but also bottlenecks within it (like the 64 ROPs and so on).

I'm wondering how much of an issue GCN's 64 threads per wavefront is compared to NVIDIA's 32 threads per warp. All I know is that if you don't feed a wavefront with 64 threads, you're going to have idle GPU resources.
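That idle-resource effect is easy to quantify: lanes are allocated in fixed-width groups, so any thread count that isn't a multiple of the group width leaves lanes dark. A small sketch comparing a 64-wide wavefront with a 32-wide warp:

```python
import math

# Lane utilization when `threads` work-items are packed into fixed-width
# execution groups. 64 models a GCN wavefront, 32 an NVIDIA warp.

def lane_utilization(threads, width):
    lanes_allocated = math.ceil(threads / width) * width
    return threads / lanes_allocated

for t in (64, 96, 160):
    print(t, lane_utilization(t, 64), lane_utilization(t, 32))
# e.g. 96 threads: 64-wide groups waste a quarter of the lanes (0.75),
# while 32-wide groups fit exactly (1.0).
```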


22 minutes ago, DrMacintosh said:

1080 performance for $250 is a good deal. There's no other way to slice this if it's real. Though I doubt it is; this same hype surrounded the RX 480.

It is a good deal today.... 

 

And seriously though... People seem to underestimate how much the overall performance crown matters to many consumers that don't know better.

 

I know plenty of people who go for intel CPUs because "intel makes the best cpus", which isn't wrong. But when I tell them that AMD's offerings at that price are as good or better, they say stuff like they prefer to buy from the best. GPUs are the same way. 

 

Hell it's the same thing for cars tbh. Halo products push market perceptions much more than any other segment. And market perception, not actual performance, pushes most of the sales to consumers (though not to miners heh)

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


Do not forget about the miners though; they will fuck it up again, so still no hope.

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


7 minutes ago, M.Yurizaki said:

I'm wondering how much of an issue GCN's 64 threads per wavefront is compared to NVIDIA's 32 threads per warp. All I know is that if you don't feed a wavefront with 64 threads, you're going to have idle GPU resources.

I honestly don't know. I really don't get why it's designed like that, but it must be a fundamental issue with GCN, since it's been like this since the 290 series.

2 minutes ago, Curufinwe_wins said:

It is a good deal today.... 

 

And seriously though... People seem to underestimate how much the overall performance crown matters to many consumers that don't know better.

 

I know plenty of people who go for intel CPUs because "intel makes the best cpus", which isn't wrong. But when I tell them that AMD's offerings at that price are as good or better, they say stuff like they prefer to buy from the best. GPUs are the same way. 

 

Hell it's the same thing for cars tbh. Halo products push market perceptions much more than any other segment.

Indeed. The KOTH crown has way too much influence on buyers' choices. A 1080 Ti being a great GPU, performance-wise, has no impact on the performance of the 1060 cards.



Nice joke

This is what happened with the Vega 56/64 launch: "Much better performance than the GTX 1080 Ti for the same price!"


17 minutes ago, Curufinwe_wins said:

I know plenty of people who go for intel CPUs because "intel makes the best cpus", which isn't wrong. But when I tell them that AMD's offerings at that price are as good or better, they say stuff like they prefer to buy from the best. GPUs are the same way. 

While there are many people who follow that mentality, there is also a good chunk of consumers who look at price/performance, especially those on a budget, which is AMD's bread and butter.

 

Budget cards make AMD money and the RX 480/580 have been a massive success for them. Really AMD became better than Nvidia imo ever since Radeon Crimson and now with Radeon ReLive I don't even consider Nvidia since I am a budget consumer. 


15 minutes ago, Notional said:

I honestly don't know. I really don't get why it's designed like that, but it must be a fundamental issue with GCN, since it's been like this since the 290 series.

Indeed. The KOTH crown has way too much influence on buyers' choices. A 1080 Ti being a great GPU, performance-wise, has no impact on the performance of the 1060 cards.

 

Decreasing the thread count per wavefront would require quite a bit more cache and control silicon.


6 minutes ago, DrMacintosh said:

While there are many people who follow that mentality, there is also a good chunk of consumers who look at price/performance, especially those on a budget, which is AMD's bread and butter.

 

Budget cards make AMD money and the RX 480/580 have been a massive success for them. Really AMD became better than Nvidia imo ever since Radeon Crimson and now with Radeon ReLive I don't even consider Nvidia since I am a budget consumer. 

What are you talking about?

 

AMD doesn't have a price-to-performance lead anywhere in their stack right now due to crypto.

 

[attached: two price/performance comparison charts]

 

And their software isn't good either... I honestly don't know how anyone could count reLive as a plus for AMD by comparison.

 

Now it isn't AMD's fault that their cards suck (relatively speaking) at gaming, and are great (relatively speaking) at compute... but that's how AMD is making money right now.



5 minutes ago, DrMacintosh said:

While there are many people who follow that mentality, there is also a good chunk of consumers who look at price/performance, especially those on a budget, which is AMD's bread and butter.

 

Budget cards make AMD money and the RX 480/580 have been a massive success for them. Really AMD became better than Nvidia imo ever since Radeon Crimson and now with Radeon ReLive I don't even consider Nvidia since I am a budget consumer. 

 

Not enough. The performance segment, the *high end* (not the enthusiast tier), sells as many cards as the midrange and low end together. Factor in the margins of that segment with those volume sales and it's a lot of money: roughly 50% of the entire market by volume, but more like 75% of the total money in the market.
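The arithmetic behind a 50%-of-volume / 75%-of-revenue split works out if the performance segment carries roughly three times the average selling price (prices below are illustrative, not real market data):

```python
# Illustrative only: equal unit volumes, ~3x ASP difference.
units_perf, asp_perf = 50, 450    # performance segment: 50% of units at $450
units_rest, asp_rest = 50, 150    # midrange + low end:  50% of units at $150

rev_perf = units_perf * asp_perf
rev_total = rev_perf + units_rest * asp_rest
print(rev_perf / rev_total)       # 0.75 -> 75% of the money
```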


Just now, Curufinwe_wins said:

AMD doesn't have a price-to-performance lead anywhere in their stack right now due to crypto.

Neither does Nvidia; all cards are affected.

 

AMD's MSRPs for the RX series of cards make them the best options imo. While prices are inflated, that is not representative of AMD's pricing model, since they have zero control over what retailers charge.


2 minutes ago, DrMacintosh said:

Neither does Nvidia; all cards are affected.

 

AMD's MSRPs for the RX series of cards make them the best options imo. While prices are inflated, that is not representative of AMD's pricing model, since they have zero control over what retailers charge.

 

Neither does Nvidia, except everywhere in the stack at current prices (the 1050 is currently the same price with slightly better performance than the non-cut-down 560, the 1060 is lower priced with better performance than the 580... the list goes on).

 

You can't look at this stuff in a vacuum. Nvidia could come out tomorrow and claim a $400 MSRP for their 1080 Ti and nothing would change.

 

In all seriousness, have you ever looked at how meaningless MSRPs are for SSDs, for example? They're almost always way higher than what the market allows them to be sold for.


Just now, DrMacintosh said:

Neither does Nvidia; all cards are affected.

 

AMD's MSRPs for the RX series of cards make them the best options imo. While prices are inflated, that is not representative of AMD's pricing model, since they have zero control over what retailers charge.

 

We aren't doing AMD any favors by buying their lower-tier cards; it's a slow death. Polaris at launch had margins around 30%, which is great for keeping AMD going as they are, but not for expanding.


1 minute ago, Razor01 said:

 

Not enough. The performance segment, the *high end* (not the enthusiast tier), sells as many cards as the midrange and low end together. Factor in the margins of that segment with those volume sales and it's a lot of money: roughly 50% of the entire market by volume, but more like 75% of the total money in the market.

And? AMD is surviving with their current market. They can't go into the high end without having capital. 

 

AMD would love to be in the high end, but the reality of AMD's situation prevents them from doing that often and well, so they spend their R&D on midrange cards which AMD knows will sell.


1 minute ago, Razor01 said:

 

We aren't doing AMD any favors by buying their lower-tier cards; it's a slow death. Polaris at launch had margins around 30%, which is great for keeping AMD going as they are, but not for expanding.

They are expanding. The Radeon Graphics division is operating in the green. 


Just now, DrMacintosh said:

And? AMD is surviving with their current market. They can't go into the high end without having capital. 

 

AMD would love to be in the high end, but the reality of AMD's situation prevents them from doing that often and well, so they spend their R&D on midrange cards which AMD knows will sell.

Survival is a relative term; survival doesn't give them growth. They need to be able to sustain a two-front war, which means they need to grow both sides at once, not just one or the other: against Intel, which has gobs of cash to throw at R&D if they wish, and nV, which has an R&D budget for GPUs higher than that of AMD's CPU and GPU divisions combined.


4 minutes ago, Curufinwe_wins said:

What are you talking about?

 

AMD doesn't have a price-to-performance lead anywhere in their stack right now due to crypto.

 

And their software isn't good either... I honestly don't know how anyone could count reLive as a plus for AMD by comparison.

 

Now it isn't AMD's fault that their cards suck (relatively speaking) at gaming, and are great (relatively speaking) at compute... but that's how AMD is making money right now.

That depends on your location. A 580 can definitely throw some punches against the 1060, no problem, and the price difference varies depending on where you live. But as you stated, it's because of crypto, not MSRP. Vendors demand those prices because they sell all the AMD cards anyway. You can't buy an AMD card if it's sold out, after all.

 

As for the software, I completely disagree. It looks great, it's fast and snappy, and it has loads of features and things to tinker with. Software is one of the areas where AMD is superior to Nvidia.



2 minutes ago, Razor01 said:

They need to be able to sustain a two-front war, which means they need to grow both sides at once, not just one or the other.

No, they don't. AMD is not currently able to compete at the high end. Wanting them to do so anyway just means AMD's cards would be bad across the board; it would be a terrible business decision.

 

They have to be good at midrange cards before they can be good at making high-end cards. R&D money does not come out of nowhere.

