AMD almost worth a quarter of what it paid for ATI

asim1999

The 300 series is putting pressure on Nvidia? Market data doesn't show that. Projections leave the market-share split at the same 25-75 it was before the Titan X, 980 Ti, and 300 series arrived. And yes, Nvidia is doing it the right way: having an entire lineup that can use G-Sync is a good thing. For AMD not to have the entire 300 series support FreeSync is shooting themselves in the foot. AMD gives no incentive to buy; Nvidia gives plenty, and that's before they stop pumping resources into driver support for older generations of cards. AMD doesn't capitalize on what it can do, and what it can do has been disappointing of late. Fiji is a disaster: unbalanced, underperforming, wasteful, and inflexible. They should have added more ROPs and other engine resources and cut back the number of stream processors; that's been the conclusion of AnandTech and others. AMD is not being competitive at all. It's time for the old dinosaur to retire.

 

Market data? No, we are talking about AMD's ability to make good products; their ability to market them and raise awareness is another topic, and there I would agree they aren't as good as Nvidia.

While I can agree that feature ubiquity across an entire series of cards, like G-Sync, would be good, how useful is it actually?

If someone made a topic here...

 

"Hey, I'm buying a 750ti / 960 (maybe even 970) and want to get a G-sync montior, what one do you recommend?"

 

No one is going to answer with a monitor model. The majority would suggest there is much greater value in buying a more powerful GPU and sticking with the monitor they have. It doesn't make sense to spend two to three times as much on the monitor as on the GPU; you'd want to look hard at where you are putting your money. Even spending the same amount on the monitor as on the GPU is questionable.

As most mainstream GPUs can handle 1080p60, the only time G-Sync/FreeSync value comes into play is when people are trying to play above 2.5K up to 4K. People doing this are very much at the enthusiast level and are not trying to drive a G-Sync monitor with a low-spec GPU. I would hardly say it's AMD shooting themselves in the foot not having FreeSync on all cards.

300 series putting pressure on Nvidia? Yes

Google 390X vs 980 and you'll see plenty of articles and reviews showing the 390X keeping pace with, and exceeding, the 980 in many benchmarks. That also implies a well-binned, slightly overclocked "revised" 290X can beat a 980 for less money with double the VRAM. Games might not use that RAM, but there are programs that do. Throw in the much higher OpenCL performance and you get a small collection of performance and value incentives that could catch a few people's eyes.

 

"Fiji is a disaster. It's unbalanced, underperforming, wasteful, and inflexible."

Why? Is it because they now have a product that can trade blows with the 980 Ti/Titan X, which they did not have before? It has on-par, sometimes greater, 4K performance despite its smaller amount of onboard memory, and a small form factor that lets it fit into some really small cases. The main issue I see is that they can't make enough of them.

Yes, AMD could do better, but so could Nvidia and Intel given how many more resources they have. Why no anger towards the other two?

Lol, no. It's well known that Nvidia has held back its cards just to stay on par with AMD so as not to kill them off outright, and the 680 topped the 7970 in most cases too. Also, the original Titan was supposed to have HBM, but it wasn't even close to being ready for release.

The 7970 GHz Edition bested every 680 on the market. The 7990 was the fastest graphics card on the planet. The 290X is on par with the 780 Ti, and the R9 295X2 is still the fastest graphics card on the planet. HBM was developed by AMD (with SK Hynix); why the hell would Nvidia have it before AMD?

Hello This is my "signature". DO YOU LIKE BORIS????? http://strawpoll.me/4669614

Quoting the reply above: "Market data? No, we are talking about AMD's ability to make good products ... Why no anger towards the other two?"

It's not even about marketing. AMD's cards generally do not keep up, and benchmarks still aren't the whole story. Driver support in general has been worse than Nvidia's too. Frame variance also tends to be higher on AMD cards.

 

Then those people would be in the wrong. VRR has use both above and below 60fps. I would suggest a monitor but also ask what they intend to play. If it's not AAA titles trying to max out settings, then I see no issue.

 

Again, look at the frame variances; winning by 2 fps alone means nothing. AMD has always been good for consumer compute, but higher OpenCL performance? That's rare unless you're doing double precision. The reality is Nvidia has been cleaning AMD's clock in compute for ages. Theoretical FLOPS numbers don't mean anything if you can't get near them in real-world performance, which has been the big problem for AMD and OpenCL ever since OpenCL was created. The low uptake in HPC is telling enough, especially now that Xeon Phi is capturing more market share than AMD has ever enjoyed in compute accelerators, despite much lower theoretical performance numbers. The 300 series isn't pressuring Nvidia at all. If Nvidia were pressured, it would be in a price war.
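For reference, the "theoretical FLOPS" figure being dismissed here is just a simple product of unit count and clock. A minimal sketch for the two cards from the 390X-vs-980 comparison; the shader counts and clocks are approximate launch specs quoted from memory, so treat them as assumptions:

```python
def peak_tflops(shaders: int, clock_mhz: float) -> float:
    """Theoretical single-precision peak: each shader can retire one
    fused multiply-add (counted as 2 floating-point ops) per clock."""
    return shaders * clock_mhz * 1e6 * 2 / 1e12

# Approximate launch specs (assumed): R9 390X = 2816 SPs @ 1050 MHz,
# GTX 980 = 2048 CUDA cores @ 1126 MHz boost.
r9_390x = peak_tflops(2816, 1050)   # ~5.9 TFLOPS
gtx_980 = peak_tflops(2048, 1126)   # ~4.6 TFLOPS
```

On paper the 390X has roughly 28% more throughput than the 980, which is exactly the kind of gap the post argues never materializes in real-world compute.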

 

Trade blows? It loses across the board, except in one or two games, by at least 5 fps. It has the potential to be far more powerful if you rearrange the ratios of the various engines: had they come down to 3072 SPs and rebalanced the rest, the gaming performance would be unparalleled. They also locked themselves into 4GB models, which is not appealing in an age when game companies are not optimizing for memory usage (or at all, really). Supply is a side issue for me. The GPU has fundamental, inescapable problems rooted in terrible design decisions. Fiji by all rights should be a 4K 60fps GPU; it was improperly designed and launched anyway. Even Raja Koduri has admitted that publicly and said that if he could go back he'd change almost everything.
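The "unbalanced" complaint can be put in numbers. A rough sketch of the shader-to-ROP ratio argument, with unit counts assumed from launch coverage of the two chips:

```python
def shaders_per_rop(shaders: int, rops: int) -> float:
    """How many stream processors feed each raster back-end (ROP) unit."""
    return shaders / rops

# Assumed unit counts: Hawaii (290X) = 2816 SPs / 64 ROPs,
# Fiji (Fury X) = 4096 SPs / 64 ROPs.
hawaii = shaders_per_rop(2816, 64)   # 44.0
fiji   = shaders_per_rop(4096, 64)   # 64.0
```

The shader array grew about 45% while the back end stayed flat; that lopsided ratio is what the suggestion to drop to 3072 SPs and rebalance the rest is aimed at.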

 

Intel can't do better when it has no access to the graphics IP it would need. On the CPU side they are miles ahead of Sandy Bridge. The problem is software all the way around. Microsoft keeps far too much legacy support forced into its operating systems, and Visual Studio's compiler can't remotely keep up with GCC or Clang in optimization and lacks support for many open standards, such as OpenMP (only 2.0, even though 4.0 was ratified in early 2013 and VS 2013 has had five major updates), OpenACC, and others. You can't rationally get mad at Intel over its CPUs once you're truly informed about the software landscape.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

I predict we'll see the name ATI Radeon come back.

 

Still remember the "AMD-sponsored demo" of a Fury X running DiRT on three 4K screens at 60 FPS; LMAO'd so hard that day it still hurts.

And I was right :P

And who is going to buy a (rebranded) R9 390 with 8GB when they say of the Fury, "no, no, 4GB is okay"?

I don't know who does the marketing at AMD, but they really need to find someone else.

They should have tried Hawaii + HBM first, and then later the Fury with HBM2.

They did the same thing before, debuting a new design on a "midrange" card.

 

And that 8GB 390 isn't really new either; it's called the FirePro W8100 and was released in June 2014...

(a year before the whole 390 lineup)

This forum is so green -_-.

My PC: i7 3770k @ 4.4 Ghz || Hyper 212 Evo || Intel Extreme Motherboard DZ77GA || EVGA Hybrid 980ti || Corsair Vengeance Blue 16GB || Samsung 840 Evo 120 GB || WD Black 1TB

 

Peripherals: Corsair K70 RGB || Sentey Pro Revolution Gaming Mouse || Beyerdynamic DT 990 Premium 250 Ohm Headphone || Benq XL2420Z Monitor

Well, buying ATI sure did cost. Next year AMD really needs to deliver on both the CPU and GPU fronts. After all, they'll have entirely new architectures and will use a FinFET process.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |

Well, buying ATI sure did cost. Next year AMD really needs to deliver on both the CPU and GPU fronts. After all, they'll have entirely new architectures and will use a FinFET process.

If I had to predict what is going to happen...

 

AMD's new Zen CPU is going to be a failure (i.e., not a bad chip, but the hype WILL KILL IT). I'm basically going off how Bulldozer was handled and perceived. It will be exciting to see an AMD CPU at 14nm (equal to Intel's offerings) with a focus on IPC/per-core performance, but gen-1 products always have kinks and cut corners.

 

As for their graphics cards? While the tessellation issues are still present, Nvidia will continue to exploit this with developers to push benchmarks/market share. It's basically the same deal as AMD's CPUs: theoretically, the FX-8xxx chips should be equal to an i5, but developers keep shipping engines that use one (or very few) cores. When looking at benchmarks, you only need a couple of poorly threaded games to make the i5 look far better than an 8xxx chip when they should be equal. Of course, this is a hardware/software "incompatibility" issue, so AMD is also partly to blame. Hence why they are developing Zen.

Intel i5 6600k~Asus Maximus VIII Hero~G.Skill Ripjaws 4 Series 8GB DDR4-3200 CL-16~Sapphire Radeon R9 Fury Tri-X~Phanteks Enthoo Pro M~Sandisk Extreme Pro 480GB~SeaSonic Snow Silent 750~BenQ XL2730Z QHD 144Hz FreeSync~Cooler Master Seidon 240M~Varmilo VA87M (Cherry MX Brown)~Corsair Vengeance M95~Oppo PM-3~Windows 10 Pro~http://pcpartpicker.com/p/ynmBnQ

This forum is so green -_-.

 

 

Yes, it is. AMD is down, but they have enough cash to last at least until 2019. There are major product launches set for 2016 for both their CPU and GPU divisions, and another major Zen-based APU refresh for 2017. They almost certainly won't be dead by then, so the preemptive death rattles that people like patrick puke up all over the web are nothing more than wishful thinking: vultures circling and trying to speed things along by playing Gríma Wormtongue, spreading anti-AMD FUD and gloom to deter others from buying essentially solid products.

The big problem for AMD is that these pukings across the net are legion; there are many Wormtongues painting a picture of AMD as awful as that decrepit Théoden, and there may be no Gandalf to come along and shine a light on reality.

I am impelled not to squeak like a grateful and frightened mouse, but to roar...

On the question of Fury X performance, I still haven't seen any official Windows 10 DX12 performance numbers, nor any VR reviews. Under DX12, will the incoming new techniques give the Fury X greater headroom on VRAM and show more teeth from its lopsided SP ratio? And if VR does become a focus, isn't the Fury line poised to be the best in that niche, with its higher-bandwidth VRAM and excessive SP count?

Genuine question; I don't know, but it seems like it was designed for a release six months from now.

If AMD dies off, I wonder who will take over. This might also add some competitiveness between Intel and Nvidia (stupid idea, but a possibility). Although on the CPU side...

 

Intel's your only option if you're looking for a CPU.

Blue Jay

CPU: Intel Core i7 6700k (OC'd 4.4GHz) Cooler: CM Hyper 212 Evo Mobo: MSI Z170A Gaming Pro Carbon GPU: EVGA GTX 950 SSC RAM: Crucial Ballistix Sport 8GB (1x8GB) SSD: Samsung 850 EVO 250 GB HDD: Seagate Barracuda 1TB Case: NZXT S340 Black/Blue PSU: Corsair CX430M

 

Other Stuff

Monitor: Acer H236HL BID Mouse: Logitech G502 Proteus Spectrum Keyboard: I don't even know Mouse Pad: SteelSeries QcK Headset: Turtle Beach X12

 

GitHub

It's called a scaler. It scales the image to the native resolution of the panel and adds the OSD plus whatever colour/contrast settings are set on the monitor itself. The scaler is the primary ASIC on the monitor controller, the logic board/PCB with the inputs in the monitor. "Scalar", by contrast, is a mathematical term, also used in physics and (non-visual) signal processing: https://en.wikipedia.org/wiki/Scalar
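Conceptually, the simplest thing a scaler can do is nearest-neighbour resampling; real monitor scalers use better filtering (bilinear/bicubic and sharpening), but a toy sketch shows the idea:

```python
def nearest_neighbor_scale(src, dst_w, dst_h):
    """Map each output pixel back to the nearest pixel of the
    non-native input image, as a scaler does when (say) a 1080p
    signal is shown on a panel with a different native resolution."""
    src_h, src_w = len(src), len(src[0])
    return [[src[y * src_h // dst_h][x * src_w // dst_w]
             for x in range(dst_w)]
            for y in range(dst_h)]

# A 2x2 "frame" stretched to a 4x4 native grid:
frame = [[1, 2],
         [3, 4]]
scaled = nearest_neighbor_scale(frame, 4, 4)
# Each source pixel now covers a 2x2 block of the output.
```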

 

 

 

It's called making a post, you type text in the box that you want other people to read and click the post button.

 

Why are you pointing out the obvious?

 

 

 

Nvidia ended up spending a lot of money building a monitor controller from the ground up with very limited functionality, instead of working with existing controller vendors. I'd have preferred that Nvidia go the Adaptive-Sync route and make G-Sync an industry standard; that way Nvidia could ensure proper VRR quality while still allowing competition and wide adoption. Instead they ended up making yet another vendor lock-in trap.

 

 

That is just your opinion, and time will tell whether Nvidia wasted its money. There is a lot more to all of this than how the end product performs for the end user: companies have to find ways to protect their sales and make their products unique. If they don't, they end up like AMD, up the creek without an income.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  

That's the Fury X you're thinking of.

 

That's the only one on sale now, so it has to be; you can't make claims about the price and performance of a product that hasn't even been released.

Neither Nvidia nor Intel would be allowed to buy AMD as a whole: Nvidia would get a GPU monopoly, or Intel a CPU monopoly.

So although I (kind of) hate Samsung, I hope they pump a lot of money into AMD.

However, I don't think AMD fits Samsung well, as Samsung is into mobile CPUs and AMD's chips are really power-hungry.

Samsung might be interested in the APU department for phones.

I saw a rumor about AMD developing an APU that would have the power of an R9 290X but with a total TDP of 300W.

The 7970 GHz Edition bested every 680 on the market. The 7990 was the fastest graphics card on the planet. The 290X is on par with the 780 Ti, and the R9 295X2 is still the fastest graphics card on the planet. HBM was developed by AMD (with SK Hynix); why the hell would Nvidia have it before AMD?

Lol, okay buddy. First of all, it did not, plain and simple. Second, the 7990 was a card that would melt in your case and had the worst driver support I've ever seen for most of its life. Third, the 295X2 will kill your power supply because it draws far more power than the cables can safely support. Fourth, you have no fucking idea how a market works, do you?

Hush.

Lol, okay buddy. First of all, it did not, plain and simple. Second, the 7990 was a card that would melt in your case and had the worst driver support I've ever seen for most of its life. Third, the 295X2 will kill your power supply because it draws far more power than the cables can safely support. Fourth, you have no fucking idea how a market works, do you?

Hush.

7970 GHz > 680. 7990 > 690. 295X2 > Titan Z. (Don't talk about power problems; if you read the manual, it clearly says you must check your PSU.) The drivers are fixed.

AMD hit a new low since 2008: $1.77, and closed the day at $1.79.

 

 

god have mercy on their souls; I know I won't
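The thread title's claim that AMD is worth almost a quarter of what it paid for ATI is simple arithmetic: market cap = share price × shares outstanding. The $5.4B ATI price (2006, cash plus stock) is well documented; the share count below is an approximation from memory, so treat it as an assumption:

```python
ati_price = 5.4e9   # what AMD paid for ATI in 2006 (cash + stock)
shares    = 777e6   # approximate AMD shares outstanding, mid-2015 (assumed)
close     = 1.79    # the day's closing price quoted above

market_cap = close * shares      # roughly $1.39 billion
ratio = market_cap / ati_price   # roughly 0.26, i.e. about a quarter
```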

Quoting the reply above: "No one is going to answer with a monitor model ... I would hardly say it's AMD shooting themselves in the foot not having FreeSync on all cards."

 

The point is added value. What does someone upgrading from the 200 series to the equivalent 300 series actually get?

 

We have a 370 that is a rebranded 265, so it actually performs worse than its last-gen equivalent, the 270. The midrange 380 is, again, a worse-performing card than last generation's 280X. The 390 is literally a 290, and the 390X is literally a 290X. There is no reason for someone who already has a GPU to upgrade to the 300 series; no value is added over what was previously available.

 

The 380 has nothing the 285 didn't have, because it's the same GPU. The point of the 285 was that, although it wasn't as good as a 280X, it at least had the new features AMD was then offering, such as TrueAudio and FreeSync support, which gave it some value over the 280X. The 380 adds nothing on top of that.

Quoting the post above: "AMD is down but they have enough cash to last at least until 2019 ... there may be no Gandalf to come along to shine a light on reality."

 

The forum is not green. Just because people acknowledge that a company isn't doing well doesn't make them anti that company; if it did, everyone on this forum must absolutely hate Atari and Nintendo. As I have said many times before, if you want to know how biased or unbiased a forum is, go look at what is being recommended in the planning subforum. I'm guessing the people who constantly claim this forum is green don't spend any time down there.

 

Also, for your information, AMD has been at or above a 72% chance of bankruptcy for the last six years.

http://www.gurufocus.com/news/261053/a-bankruptcy-analysis-of-amd-groupon-and-radioshack-

 

This means that regardless of what they have in the bank, it won't save them from liquidation if things turn sour. 
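The linked analysis doesn't say which model produces that "72% chance of bankruptcy", but screens like this are typically built on the Altman Z-score, so here is a sketch of that formula under that assumption; the balance-sheet numbers below are purely illustrative, not AMD's actual figures:

```python
def altman_z(working_capital, retained_earnings, ebit,
             market_value_equity, sales, total_assets, total_liabilities):
    """Altman's 1968 Z-score for public manufacturers.
    Z < 1.81 is the 'distress' zone, Z > 2.99 the 'safe' zone."""
    ta, tl = total_assets, total_liabilities
    return (1.2 * working_capital / ta
            + 1.4 * retained_earnings / ta
            + 3.3 * ebit / ta
            + 0.6 * market_value_equity / tl
            + 1.0 * sales / ta)

# Hypothetical figures (in $M) for a loss-making, highly leveraged firm:
z = altman_z(working_capital=10, retained_earnings=-20, ebit=-5,
             market_value_equity=30, sales=90,
             total_assets=100, total_liabilities=80)
# z lands around 0.8, deep in the distress zone
```

Negative retained earnings and negative EBIT drag the score down fast, which is why a company can sit in the distress zone for years while still holding cash.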

Quoting the post above: "The point is added value. What does someone upgrading from the 200 series to the equivalent 300 series actually get? ... The 380 has nothing that the 285 didn't have."

 

Well, first off...

380 ≠ 280X.

The 380 launched at the same price as the R9 285, the card it was supposed to replace, not the 280X.

The 390 performs significantly better than the 290 at 1440p and bronco-busts the 970. It exists to replace the 280X at a $330 price point.

I'm not a fan of the R9 300 series, but 60 percent of that series only exists because AMD couldn't produce an entire series with HBM due to manufacturing difficulties. They had to give consumers SOMETHING. The Nano, Fury, Fury X, and Fury X2 (or whatever it's called) would never have been enough for a full-fledged series. The R9 300 series is filler; it exists to satisfy people at the lower end of AMD's spectrum who want those extra frames at the nice price points the R9 390 provides.

CPU: i5 4670k | Motherboard: MSI B85I | Stock cooler | RAM: 8gb DDR3 RAM 1600mhz | GPU: EVGA GTX 770 Superclocked w/ACX cooling | Storage: 1TB Western Digital Caviar Black | Case: Fractal Design Define R4 w/ Window

Quoting the post above: "380 ≠ 280X ... The R9 300 series is filler."

https://www.youtube.com/watch?v=4ckA_KTdaJg

You know what would have made the entire 300 series more appealing and worth buying? If every card had been GCN 1.2 and they'd all gotten the power-delivery microarchitecture upgrade that Grenada got over Hawaii. Just that much would have made the series worth it. As for Fury, it should have been GDDR5 or HMC, and Hynix should have eaten the production costs for a product that truly had no business being in the current landscape. Honestly, if AMD stopped shooting itself in the foot, it would be in much better shape today.

You know what would have made the entire 300 series more appealing and worth buying? If every card had been GCN 1.2 and they'd all gotten the power-delivery microarchitecture upgrade that Grenada got over Hawaii. Just that much would have made the series worth it. As for Fury, it should have been GDDR5 or HMC, and Hynix should have eaten the production costs for a product that truly had no business being in the current landscape. Honestly, if AMD stopped shooting itself in the foot, it would be in much better shape today.

I don't know about that. I quite like the R9 Fury and its price. What I would like to see gone is the 390X; that card has NO reason to exist, offering almost zero performance gain over the 290X. The Fury or the Nano should take its place.

The FX brand just screwed them for too long.

 

It may be time to buy some cheap stocks.

Mobo: Z97 MSI Gaming 7 / CPU: i5-4690k@4.5GHz 1.23v / GPU: EVGA GTX 1070 / RAM: 8GB DDR3 1600MHz@CL9 1.5v / PSU: Corsair CX500M / Case: NZXT 410 / Monitor: 1080p IPS Acer R240HY bidx

You know what would have made the entire 300 series more appealing and worth buying? If every card had been GCN 1.2 and they'd all gotten the power-delivery microarchitecture upgrade that Grenada got over Hawaii. Just that much would have made the series worth it. As for Fury, it should have been GDDR5 or HMC, and Hynix should have eaten the production costs for a product that truly had no business being in the current landscape. Honestly, if AMD stopped shooting itself in the foot, it would be in much better shape today.

 

To be honest, I've thought the 300 series was just a way for AMD to shift all their older 200-series stock, with a few added extras, to hold them over till they can go full HBM next generation. They could easily cater to the upper mid-range with HBM1, use HBM2 for their top-tier cards, and leave GDDR5 for the low end.

(don't know how to multiquote across pages...)

The one real drawcard for the 390X is if you need a GPU with a massive amount of memory for productivity purposes and your software takes advantage of OpenCL. There is some inherent value there.

2013 should have been a good year for AMD, given that both new consoles as well as the new Mac Pro use AMD graphics, but not even that helped. Now we can just hope that Nvidia and Intel don't team up.

2013 should have been a good year for AMD, given that both new consoles as well as the new Mac Pro use AMD graphics, but not even that helped. Now we can just hope that Nvidia and Intel don't team up.

 

I personally don't think they will, and if they did, they would not be aggressive about it; I think that could/would be construed as antitrust, given there aren't any other real players in the discrete GPU segment.
