
AMD almost worth a quarter of what it paid for ATI

asim1999

A lot of the Adaptive Sync standard was done by them; the most basic underlying tech was already implemented in eDP, but there were additions made. As for scalars and monitors, I'm not so sure it was needed so much as them wanting their focus to be on open vs closed. Why develop it in a closed fashion if you have no intention of barring anyone access to it?

 

But that ignores that everyone is collaborating on new tech. AMD and Hynix are two big names in HBM, but not the only names, and there were several next-gen memory techs being worked on by different groups, some of them dwarfing AMD massively.

 

My point is everyone is co-developing, even far bigger players like Intel, Samsung, and Microsoft. I cannot fault the little guy for doing something no different from the big guys just because they NEED it more than the big guys do.

 

But I CAN fault them, and do MASSIVELY, when they make massive asses of themselves by doing things like cutting corners with Bulldozer. CMT, as a tech, will probably never recover on the consumer side, and the enterprise side may never see a better opportunity to develop implementations. AMD ruined it.

 

I'm pretty sure AMD did FreeSync the way they did because they didn't have the resources to develop their own in-house scalar like Nvidia did; by pushing that onto existing scalar manufacturers they didn't have to cough up more R+D cash.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


People, the Fury competes with the 980 Ti for $100 less...

Thats that. If you need to get in touch chances are you can find someone that knows me that can get in touch.


People, the Fury competes with the 980 Ti for $100 less...

Where you live, maybe; here it's the same price, sometimes more.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


I'm pretty sure AMD did FreeSync the way they did because they didn't have the resources to develop their own in-house scalar like Nvidia did; by pushing that onto existing scalar manufacturers they didn't have to cough up more R+D cash.

If they couldn't push Adaptive Sync into the DP standard they may well have done their own scalar; it's a little beyond our information to say one way or the other, since their push to have it added to the standard prevailed and their AS solution can now work across the spectrum without having to a) invest in their own proprietary scalar, b) force anyone to use said scalar, and c) pay for said scalars and charge more per unit because of it. They got the implementation they wanted added to every scalar. I believe for the NEXT iteration of DP it's mandatory? Every single DP monitor will be able to do FS without AMD ever having had to force a manufacturer to use a bespoke scalar or add one more drain of a manufacturing process to their ledger.

 

There really wasn't much R+D cash needed; the underlying tech already existed in eDP and was heading into full DP, it just needed to be assembled, connected, and standardized. AMD pushed, VESA relented, AS is now part of the DP standard, and any monitor that supports AS will be able to use FS.


That's going a bit far, but the reality is AMD is not in a position to compete, has terrible management layers between the C-suite and engineers which bog down communication and adaptation, and they're hemorrhaging money like mad with a huge chunk of debt due in the opening week of 2019. Meanwhile they have no significant new products until mid-to-late 2016 and aren't gaining sales or market-share ground in the meantime.

You're right; I'm just saying that every time there is bad news about AMD, the phrase 'I think this signifies the death of AMD' gets used.

If that is the case, then I think this would illustrate AMD's dilemma:

[Image: sherman-farts-o.gif]

Seems accurate


You're right; I'm just saying that every time there is bad news about AMD, the phrase 'I think this signifies the death of AMD' gets used.

Which is a fair observation, but I think most of the community realizes that AMD cannot compete, purely because of its financial position. At this point I think most of the community wishes AMD would split and that both major pieces were acquired by companies with money to burn. AMD's death would be best for consumers if it happened sooner rather than later.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


TBH, if Thermaltake can make the same exact case and sell it cheaper, in a way it means Fractal was overcharging for it. Not that I'm on TT's side, but in the IT world it's always about the best bang for buck.

You're in the wrong thread.



Just thought I would leave this here.

AMD vs Nvidia market share graph:

[Image: 06Qvn3i.jpg]

How much Intel, Nvidia and AMD spend on research:

[Image: 1amd_nvidia_intel.png]



Which is a fair observation, but I think most of the community realizes that AMD cannot compete, purely because of its financial position. At this point I think most of the community wishes AMD would split and that both major pieces were acquired by companies with money to burn. AMD's death would be best for consumers if it happened sooner rather than later.

 

Which community do you speak of that wants AMD to be broken up? And how would it be best for consumers for there to be less competition? Surely the more competition the better?

 

Even if AMD never regained the CPU performance lead it once had, I would much rather have a three-way fight between AMD, Nvidia and Intel than just Nvidia and Intel going at each other, which is a scenario you throw out there quite a lot.

All AMD needs to do is make sure Zen is moderately competitive, which I really hope they do. If it is perhaps not as fast as Intel, but has the appropriate level of power usage and price, I would be very tempted to replace my personal render farm with Zen 8-core CPUs. They already have products and foundations for future GPU advancement with HBM 1 and 2 to really compete with Nvidia on performance and, by the looks of it, price. I say HBM1 because I suspect HBM1 could very easily provide the memory and bandwidth for mid- and low-end cards. Hopefully by next year production has smoothed out.
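For what it's worth, the back-of-the-envelope numbers do support the HBM1-for-midrange idea. Here's a rough sketch, assuming first-gen HBM's published per-stack figures (1024-bit interface, roughly 1 Gbit/s per pin, 1 GB per stack) and taking a typical 256-bit, 7 Gbps GDDR5 card of the era as the comparison point:

```python
# Back-of-the-envelope memory bandwidth comparison (illustrative figures only).

def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak bandwidth in GB/s = bus width (bits) * per-pin data rate (Gbit/s) / 8."""
    return bus_width_bits * data_rate_gbps / 8

# First-gen HBM: 1024-bit interface per stack at ~1 Gbit/s per pin.
hbm1_stack = bandwidth_gbs(1024, 1.0)                          # ~128 GB/s per stack
print(f"HBM1, 2 stacks (2 GB): {2 * hbm1_stack:.0f} GB/s")     # ~256 GB/s
print(f"HBM1, 4 stacks (4 GB): {4 * hbm1_stack:.0f} GB/s")     # ~512 GB/s (Fury X class)

# Typical mid-range GDDR5 setup of the era: 256-bit bus at 7 Gbit/s.
print(f"256-bit GDDR5 @ 7 Gbps: {bandwidth_gbs(256, 7.0):.0f} GB/s")  # ~224 GB/s
```

On those assumed numbers, bandwidth isn't the problem for a mid-range part; the 1 GB-per-stack capacity and the interposer cost look like the bigger question marks.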

 

 


I'm pretty sure AMD did FreeSync the way they did because they didn't have the resources to develop their own in-house scalar like Nvidia did; by pushing that onto existing scalar manufacturers they didn't have to cough up more R+D cash.

 

It's called a scaler. It scales the image to the native resolution of the panel in the monitor and adds the OSD plus whatever colour/contrast settings are set on the monitor itself. The scaler is the primary ASIC on the monitor controller, which is the logic board/PCB with the inputs in the monitor. A scalar is a mathematical quantity, also used in other ways in physics and processing (not visual): https://en.wikipedia.org/wiki/Scalar
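If it helps to picture the scaler's main job, here's a minimal sketch of the "scale to the panel's native resolution" step, written in plain Python for illustration (nearest-neighbour only; a real scaler ASIC does this in fixed-function hardware with better filtering, plus the OSD overlay and colour processing):

```python
# Minimal sketch of a scaler's core task: map every output (native) pixel
# back to the nearest pixel of the incoming, non-native frame.

def scale_to_native(frame, in_w, in_h, native_w, native_h):
    """frame: list of rows of pixels at the input resolution.
    Returns a frame at the panel's native resolution (nearest-neighbour)."""
    out = []
    for y in range(native_h):
        src_y = y * in_h // native_h
        row = []
        for x in range(native_w):
            src_x = x * in_w // native_w
            row.append(frame[src_y][src_x])
        out.append(row)
    return out

# e.g. a 1280x720 input signal shown on a 1920x1080 panel:
# native_frame = scale_to_native(frame_720p, 1280, 720, 1920, 1080)
```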

 

NVidia ended up spending a lot of money on making a monitor controller from the ground up with very limited functionality, instead of working with existing controller vendors. I'd prefer it if NVidia had gone the Adaptive Sync route and made Gsync an industry standard. That way NVidia could have ensured proper VRR quality while still allowing competition and wide adoption. Instead they ended up making yet another vendor lock-in trap.

 

If they couldn't push Adaptive Sync into the DP standard they may well have done their own scalar; it's a little beyond our information to say one way or the other, since their push to have it added to the standard prevailed and their AS solution can now work across the spectrum without having to a) invest in their own proprietary scalar, b) force anyone to use said scalar, and c) pay for said scalars and charge more per unit because of it. They got the implementation they wanted added to every scalar. I believe for the NEXT iteration of DP it's mandatory? Every single DP monitor will be able to do FS without AMD ever having had to force a manufacturer to use a bespoke scalar or add one more drain of a manufacturing process to their ledger.

 

There really wasn't much R+D cash needed; the underlying tech already existed in eDP and was heading into full DP, it just needed to be assembled, connected, and standardized. AMD pushed, VESA relented, AS is now part of the DP standard, and any monitor that supports AS will be able to use FS.

 

Actually, both Adaptive Sync and Gsync are based on variable Vblank signals from eDP. Remember that AMD already knows how to do eDP via their APUs in laptops, but AMD always focuses on industry standards for wide adoption. Also, why invent something others have already perfected? Better to co-develop the added function with existing controller vendors. Afaik AMD has talked about making Adaptive Sync for HDMI as well, which could bring VRR to consoles and TVs.

As for variable Vblank, that was only designed to hold an image by pausing the scaler's scan frequency. It was never made to do variable refresh rate synced to the GPU. Both NVidia and AMD seem to have underestimated the work needed to make it work flawlessly. The problematic control of overdrive in Adaptive Sync monitors and NVidia's 6-month delay prove this. However, I'm sure the issues will be ironed out, but as much of the actual tech is in the FreeSync drivers, we see how complex this is. Especially if we want to factor in non-full-screen applications, which require the drivers to take over from the built-in Windows window manager (dwm.exe, which controls the GUI of Windows).
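To illustrate why "just reuse variable Vblank" turned out to be harder than it sounds, here's a simplified sketch of the pacing decision a VRR driver and scaler have to make every single frame. The 40-144 Hz window and the function itself are made up for illustration; the real logic is split between the GPU driver and the scaler firmware:

```python
# Simplified sketch of variable-refresh pacing (illustrative only).
# The display advertises a refresh window, e.g. 40-144 Hz. The GPU holds the
# Vblank interval open until a new frame is ready, but can never hold it
# longer than the panel's maximum frame time, so slow frames force re-scans.

MIN_HZ, MAX_HZ = 40, 144
MIN_FRAME_TIME = 1.0 / MAX_HZ    # ~6.9 ms: can't refresh faster than this
MAX_FRAME_TIME = 1.0 / MIN_HZ    # 25.0 ms: must refresh at least this often

def scanout_plan(render_time):
    """Return (re-scans of the previous frame, time until the new frame shows)."""
    if render_time < MIN_FRAME_TIME:
        return 0, MIN_FRAME_TIME        # GPU outran the panel: wait (or tear)
    if render_time <= MAX_FRAME_TIME:
        return 0, render_time           # inside the VRR window: show it immediately
    # Too slow: the old frame must be re-scanned until the new one arrives,
    # and overdrive has to be retuned because the frame time keeps changing.
    rescans = int(render_time // MAX_FRAME_TIME)
    return rescans, render_time

for t in (0.005, 0.012, 0.040):         # 200 fps, ~83 fps and 25 fps renders
    print(f"{t * 1000:.0f} ms render -> {scanout_plan(t)}")
```

Everything outside the middle branch is where the pain lives: re-scanning old frames, keeping overdrive tuned for a constantly changing frame time, and doing all of it for windowed applications as well.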

 

How much Intel, Nvidia and AMD spend on research:

[Image: 1amd_nvidia_intel.png]

 

Bear in mind that Intel's much higher R&D is directly connected to their foundry. AMD and NVidia do not have this massive R&D expense, as they use GlobalFoundries and TSMC.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Nvidia is a juggernaut with money. AMD can't do shit for R&D; that's why the past few generations of cards have been jokes. Nvidia has been ahead for years now. The only reason AMD is undercutting Nvidia in the low-end segment is so they can move any cards at all, not because they're good.

 

 

The 7xxx-series, 290-series and Fury-series cards are not jokes. They were competitive cards for what they were; the reason Nvidia did so much better was the outsized brand affinity that extends over and above raw performance. Case in point: you think all those cards are jokes, which highlights your own poisoned views against AMD and for Nvidia.

I am impelled not to squeak like a grateful and frightened mouse, but to roar...


Which is a fair observation, but I think most of the community realizes that AMD cannot compete, purely because of its financial position. At this point I think most of the community wishes AMD would split and that both major pieces were acquired by companies with money to burn. AMD's death would be best for consumers if it happened sooner rather than later.

 

 

 

AMD's death would mean the CPU arm might get scuttled, due to the licensing deals with Intel and that business not being sellable if AMD is owned by another company. The GPU business might get a shot in the arm and some inflow of cash, but where does that leave the APUs that are a combination of both? Ditch it all, leave 100% of the x86 market to Intel, and let a cash-infused ATI battle Nvidia over the leftover scraps of the discrete GPU market?

 

That's your "better" scenario for consumers? Your view is absolutely cracked and makes zero sense. It only makes sense if it stems from a person that just despises the company and wants to see them burn. Just to spite you, I hope they pull through. They may still fail, but if all the smug swarms of haters were proven wrong, the comeback would taste so sweet. It's like showing up alive and well after the antagonist, whispering in your ear while you were in a bad place on death's door, told you to just go off and kill yourself.

I am impelled not to squeak like a grateful and frightened mouse, but to roar...


AMD's death would mean the CPU arm might get scuttled, due to the licensing deals with Intel and that business not being sellable if AMD is owned by another company. The GPU business might get a shot in the arm and some inflow of cash, but where does that leave the APUs that are a combination of both? Ditch it all, leave 100% of the x86 market to Intel, and let a cash-infused ATI battle Nvidia over the leftover scraps of the discrete GPU market?

 

That's your "better" scenario for consumers? Your view is absolutely cracked and makes zero sense. It only makes sense if it stems from a person that just despises the company and wants to see them burn. Just to spite you, I hope they pull through. They may still fail, but if all the smug swarms of haters were proven wrong, the comeback would taste so sweet. It's like showing up alive and well after the antagonist, whispering in your ear while you were in a bad place on death's door, told you to just go off and kill yourself.

 

Although it's correct that the x86 license would be rendered void if AMD's CPU division were to be sold to another company, you have to remember that it is a cross-license between Intel and AMD. Intel's original 64-bit architecture was Itanium, and it flopped hard on the market. All x64 functionality on the desktop market today is based on AMD's x64 (AMD64) IP. So if the cross-license is rendered void, Intel would lose all rights to produce x64 functionality in their processors, and would either be forced to make only 32-bit processors again or renegotiate the cross-license immediately.

 

Giving Intel a monopoly on the x86 market would be a disaster. It's already bad enough. The problem is that people and media are talking AMD way further down than they really are. For instance, as you mention, they are very much competitive on the GPU market, even having had the king-of-the-hill card with the 7970. But people forget quickly, and the marketing team at NVidia is efficient, to say the least. That, combined with their strategy of proprietary tech that forces vendor lock-in, has proved to be a strong cocktail for NVidia.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Press F to pay respects.

F

Main Rig: CPU: AMD Ryzen 7 5800X | RAM: 32GB (2x16GB) KLEVV CRAS XR RGB DDR4-3600 | Motherboard: Gigabyte B550I AORUS PRO AX | Storage: 512GB SKHynix PC401, 1TB Samsung 970 EVO Plus, 2x Micron 1100 256GB SATA SSDs | GPU: EVGA RTX 3080 FTW3 Ultra 10GB | Cooling: ThermalTake Floe 280mm w/ be quiet! Pure Wings 3 | Case: Sliger SM580 (Black) | PSU: Lian Li SP 850W

 

Server: CPU: AMD Ryzen 3 3100 | RAM: 32GB (2x16GB) Crucial DDR4 Pro | Motherboard: ASUS PRIME B550-PLUS AC-HES | Storage: 128GB Samsung PM961, 4TB Seagate IronWolf | GPU: AMD FirePro WX 3100 | Cooling: EK-AIO Elite 360 D-RGB | Case: Corsair 5000D Airflow (White) | PSU: Seasonic Focus GM-850

 

Miscellaneous: Dell Optiplex 7060 Micro (i5-8500T/16GB/512GB), Lenovo ThinkCentre M715q Tiny (R5 2400GE/16GB/256GB), Dell Optiplex 7040 SFF (i5-6400/8GB/128GB)


AMD's death would mean the CPU arm might get scuttled, due to the licensing deals with Intel and that business not being sellable if AMD is owned by another company. The GPU business might get a shot in the arm and some inflow of cash, but where does that leave the APUs that are a combination of both? Ditch it all, leave 100% of the x86 market to Intel, and let a cash-infused ATI battle Nvidia over the leftover scraps of the discrete GPU market?

The CPU division would most certainly not be scuttled, since Intel needs a competitor or they will have hell to pay under antitrust laws.

CPU: Intel i7 - 5820k @ 4.5GHz, Cooler: Corsair H80i, Motherboard: MSI X99S Gaming 7, RAM: Corsair Vengeance LPX 32GB DDR4 2666MHz CL16,

GPU: ASUS GTX 980 Strix, Case: Corsair 900D, PSU: Corsair AX860i 860W, Keyboard: Logitech G19, Mouse: Corsair M95, Storage: Intel 730 Series 480GB SSD, WD 1.5TB Black

Display: BenQ XL2730Z 2560x1440 144Hz


-snip-

 

I had heard there were plans to implement VRR in HDMI; I hadn't heard much other than that it would be an HDMI 2+ feature, but possibly a firmware-upgradeable one at that time?

 

It would be nice to expand it, but HDMI does have some hurdles from not being open like DP is, if memory serves.


Which community do you speak of that wants AMD to be broken up? And how would it be best for consumers for there to be less competition? Surely the more competition the better?

Even if AMD never regained the CPU performance lead it once had, I would much rather have a three-way fight between AMD, Nvidia and Intel than just Nvidia and Intel going at each other, which is a scenario you throw out there quite a lot.

All AMD needs to do is make sure Zen is moderately competitive, which I really hope they do. If it is perhaps not as fast as Intel, but has the appropriate level of power usage and price, I would be very tempted to replace my personal render farm with Zen 8-core CPUs. They already have products and foundations for future GPU advancement with HBM 1 and 2 to really compete with Nvidia on performance and, by the looks of it, price. I say HBM1 because I suspect HBM1 could very easily provide the memory and bandwidth for mid- and low-end cards. Hopefully by next year production has smoothed out.

My proposal is to let Intel and Nvidia have the unlike pieces of AMD and let those two compete in both arenas. The current arrangement means less competition. Without a lot more money for R&D, much of AMD's IP goes unused or gets prematurely deployed. With AMD trying to compete against two financial juggernauts in two very different arenas with the monolithic management system they use, it's no wonder they have such long periods between product launches, refreshes, etc.

It would mean more competition for AMD to be broken up. Intel's major GPU problem has always been a lack of IP; patent laws very much get in the way of competition in this case. And Nvidia has expressed desires to work with x86 as well. Both of them are chomping at the bit to fight on both sides. The problem is AMD is in the way. For Intel, it's better to kill and absorb Nvidia and then drive AMD into the floor permanently. If that happens, all competitive potential disappears, so it's really best if AMD goes down now.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


AMD's death would mean the CPU arm might get scuttled, due to the licensing deals with Intel and that business not being sellable if AMD is owned by another company. The GPU business might get a shot in the arm and some inflow of cash, but where does that leave the APUs that are a combination of both? Ditch it all, leave 100% of the x86 market to Intel, and let a cash-infused ATI battle Nvidia over the leftover scraps of the discrete GPU market?

That's your "better" scenario for consumers? Your view is absolutely cracked and makes zero sense. It only makes sense if it stems from a person that just despises the company and wants to see them burn. Just to spite you, I hope they pull through. They may still fail, but if all the smug swarms of haters were proven wrong, the comeback would taste so sweet. It's like showing up alive and well after the antagonist, whispering in your ear while you were in a bad place on death's door, told you to just go off and kill yourself.

The FTC would force Intel to sell x86 to at least one more big competitor. They could force it to sell to Nvidia, which has expressed many times that it would like to get into x86. As for the APU thing, Nvidia and Intel are both already making them. HSA may die, but we already have OpenMP, OpenCL, OpenACC, and CUDA. It's not as though heterogeneous integration will just stop because HSA goes up in flames. HSA is a poorly designed programming system anyway; I don't see it remotely taking off.

Let two titans fight in both arenas, let prices fall, let R&D money burn, let integration be driven by two companies with the ubiquity to make it happen. Just let the elephant in the room go. AMD has no realistic hope of recovery. Zen is going up against fully custom Skylake Xeons in an arena that has been Intel's for so long that all the major security and diagnostic suites have been written with vPro in mind for years. Zen is trying to enter that arena first instead of proving itself on the consumer side first. Strategically they need to beat Skylake to market to have a chance. Beyond that they either have to deliver astonishing performance at the same voltage and heat, or provide equal performance to Intel at lower voltage and heat while also being price competitive, and it's not as though Intel is unwilling to go into a price war to defend its most lucrative turf. Beyond that, AMD can't be as cheap given GloFo is just a more expensive foundry all around. In GPUs they're already doing everything wrong. The 300 series should have been GCN 1.2 across the board; refreshing 4-year-old chips is unacceptable to anyone sane. They chased HBM and let Hynix burn most of their R&D money on buying votes to keep Hybrid Memory Cube out of the JEDEC standard, despite the fact it's a superior memory. HMC is 4x denser at 4GB per stack on a 2.5D interposer, and it has the same bandwidth (quoting specs from the most recent Knights Landing articles, with the new Xeon Phi launching in just a couple of months and the socketed platform coming later). AMD has a death wish. Just let it go and let competition not only resume but go into overdrive.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


My proposal is to let Intel and Nvidia have the unlike pieces of AMD and let those two compete in both arenas. The current arrangement means less competition. Without a lot more money for R&D, much of AMD's IP goes unused or gets prematurely deployed. With AMD trying to compete against two financial juggernauts in two very different arenas with the monolithic management system they use, it's no wonder they have such long periods between product launches, refreshes, etc.

It would mean more competition for AMD to be broken up. Intel's major GPU problem has always been a lack of IP; patent laws very much get in the way of competition in this case. And Nvidia has expressed desires to work with x86 as well. Both of them are chomping at the bit to fight on both sides. The problem is AMD is in the way. For Intel, it's better to kill and absorb Nvidia and then drive AMD into the floor permanently. If that happens, all competitive potential disappears, so it's really best if AMD goes down now.

 

 

See, I could agree with you, but even with their tiny R&D budget AMD is able to make products that offer real competition to Nvidia. As long as they are able to do that, they are providing more competition in the marketplace. Fury/X and the 300 series are putting pressure on their Nvidia counterparts; why remove that kind of competition?

 

 

 

The FTC would force Intel to sell x86 to at least one more big competitor. They could force it to sell to Nvidia, which has expressed many times that it would like to get into x86. As for the APU thing, Nvidia and Intel are both already making them. HSA may die, but we already have OpenMP, OpenCL, OpenACC, and CUDA. It's not as though heterogeneous integration will just stop because HSA goes up in flames. HSA is a poorly designed programming system anyway; I don't see it remotely taking off.

Let two titans fight in both arenas, let prices fall, let R&D money burn, let integration be driven by two companies with the ubiquity to make it happen. Just let the elephant in the room go. AMD has no realistic hope of recovery. Zen is going up against fully custom Skylake Xeons in an arena that has been Intel's for so long that all the major security and diagnostic suites have been written with vPro in mind for years. Zen is trying to enter that arena first instead of proving itself on the consumer side first. Strategically they need to beat Skylake to market to have a chance. Beyond that they either have to deliver astonishing performance at the same voltage and heat, or provide equal performance to Intel at lower voltage and heat while also being price competitive, and it's not as though Intel is unwilling to go into a price war to defend its most lucrative turf. Beyond that, AMD can't be as cheap given GloFo is just a more expensive foundry all around. In GPUs they're already doing everything wrong. The 300 series should have been GCN 1.2 across the board; refreshing 4-year-old chips is unacceptable to anyone sane. They chased HBM and let Hynix burn most of their R&D money on buying votes to keep Hybrid Memory Cube out of the JEDEC standard, despite the fact it's a superior memory. HMC is 4x denser at 4GB per stack on a 2.5D interposer, and it has the same bandwidth (quoting specs from the most recent Knights Landing articles, with the new Xeon Phi launching in just a couple of months and the socketed platform coming later). AMD has a death wish. Just let it go and let competition not only resume but go into overdrive.

 

Why is it "unacceptable for anyone sane" that AMD does what they do? You mean having the technical knowledge to refine their GPUs to be competitive for multiple generations of cards? Is Nvidia really doing it the right way? They put out new GPU architectures, but performance-wise they are not any further ahead.


Where you live, maybe; here it's the same price, sometimes more.

That's the Fury X you're thinking of.

Thats that. If you need to get in touch chances are you can find someone that knows me that can get in touch.


See, I could agree with you, but even with their tiny R&D budget AMD is able to make products that offer real competition to Nvidia. As long as they are able to do that, they are providing more competition in the marketplace. Fury/X and the 300 series are putting pressure on their Nvidia counterparts; why remove that kind of competition?

Why is it "unacceptable for anyone sane" that AMD does what they do? You mean having the technical knowledge to refine their GPUs to be competitive for multiple generations of cards? Is Nvidia really doing it the right way? They put out new GPU architectures, but performance-wise they are not any further ahead.

The 300 series is putting pressure on Nvidia? Market data doesn't show that. Projections leave the market-share split the same 25-75 it was before the Titan X, 980 Ti, and 300 series arrived. And yes, Nvidia is doing it the right way. Having an entire lineup that can use G-Sync is a good thing; for AMD to not have the entire 300 series able to use FreeSync is only shooting themselves in the foot. AMD gives no incentive for people to buy. Nvidia gives plenty, and that's before they stop pumping resources into driver support for older generations of cards. AMD doesn't capitalize on what they can do, and what they can do has been disappointing of late. Fiji is a disaster: it's unbalanced, underperforming, wasteful, and inflexible. They should have put in more ROPs and other engine resources and cut back the number of stream processors; that's been the conclusion from Anandtech and others. AMD is not being competitive at all. It's time for the old dinosaur to retire.
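To put some numbers on the "unbalanced" claim, here are the headline unit counts from the public spec sheets (note that AMD stream processors and Nvidia CUDA cores are not directly comparable, so the interesting ratio is the AMD-internal one; the bottleneck conclusion itself is Anandtech's, not something these counts prove on their own):

```python
# Shader vs ROP scaling from Hawaii (R9 290X) to Fiji (Fury X),
# alongside GM200 (GTX 980 Ti). Spec-sheet unit counts only.

cards = {
    "R9 290X (Hawaii)":   {"shaders": 2816, "rops": 64},
    "Fury X (Fiji)":      {"shaders": 4096, "rops": 64},
    "GTX 980 Ti (GM200)": {"shaders": 2816, "rops": 96},
}

hawaii, fiji = cards["R9 290X (Hawaii)"], cards["Fury X (Fiji)"]
print(f"Hawaii -> Fiji shader count: +{(fiji['shaders'] / hawaii['shaders'] - 1) * 100:.0f}%")  # ~+45%
print(f"Hawaii -> Fiji ROP count:    +{(fiji['rops'] / hawaii['rops'] - 1) * 100:.0f}%")        # +0%

for name, c in cards.items():
    print(f"{name}: {c['shaders'] / c['rops']:.0f} shaders per ROP")
```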

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


The gamble is real... AMD is throwing everything they've got into this final move. Will they succeed? I hope they do... If not, it's game over for AMD...

... Life is a game and the checkpoints are your birthday , you will face challenges where you may not get rewarded afterwords but those are the challenges that help you improve yourself . Always live for tomorrow because you may never know when your game will be over ... I'm totally not going insane in anyway , shape or form ... I just have broken English and an open mind ... 

