
AMD discrete GPU market share eroded to less than 20 per cent

Mads-Ejnar

I dunno. You can keep your discrete GPUs...

 

I want that Zen with integrated graphics as good as the latest-gen x60/x70 GPU, with shared integrated HBM RAM for both the CPU and GPU portions.

And then bring on the era of powerful gaming laptops and NUC-sized gaming PCs. The era of disgusting huge towers is coming to an end... good riddance.


I dunno. You can keep your discrete GPUs...

 

I want that Zen with integrated graphics as good as the latest-gen x60/x70 GPU, with shared integrated HBM RAM for both the CPU and GPU portions.

And then bring on the era of powerful gaming laptops and NUC-sized gaming PCs. The era of disgusting huge towers is coming to an end... good riddance.

Why? Who actually uses an APU? And the amount of heat it would give off and the power it would draw would be huge. It would have to be a 200-watt chip.

Hello This is my "signature". DO YOU LIKE BORIS????? http://strawpoll.me/4669614


 

One day people will stop claiming that a different cooler does anything at all to the amount of heat produced.

Why stop saying it when it's true? The resistance of any circuit is partly determined by temperature. The hotter you let it run, the more power it uses, dissipated as heat according to current² × resistance, where the resistance = resistivity of the material × length of the conductor ÷ cross-sectional area, and the resistivity itself varies with temperature (by a function that is a huge mess).
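For reference, these are the textbook relations being invoked here, written under the usual simplifying assumption of a uniform ohmic conductor (L is the conductor length, A its cross-sectional area, and α the material's temperature coefficient of resistivity; a real CPU die is of course far messier):

```latex
P = I^{2} R, \qquad
R = \frac{\rho(T)\, L}{A}, \qquad
\rho(T) \approx \rho_{0}\,\bigl[\,1 + \alpha\,(T - T_{0})\,\bigr]
```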

 

Seriously people, I'm an American and I know this, and I'm not even an electrical engineer. Freshman year physics sequence, AP Physics C. Hell this is basic preparatory physics for the NJ high school system (at least back when I took physics in 2011 and 2012).

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I dunno. You can keep your discrete GPUs...

 

I want that Zen with integrated graphics as good as the latest-gen x60/x70 GPU, with shared integrated HBM RAM for both the CPU and GPU portions.

And then bring on the era of powerful gaming laptops and NUC-sized gaming PCs. The era of disgusting huge towers is coming to an end... good riddance.

A 4-core Zen chip with an integrated GPU and DDR4 will be quite refreshing, but Zen APUs are only arriving in 2017 AFAIK.

Carrizo is, however, a beast in terms of power efficiency and is quite well suited to laptops. The availability just isn't really there, though.

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


Why? Who actually uses an APU? And the amount of heat it would give off and the power it would draw would be huge. It would have to be a 200-watt chip.

We have AIOs and the NH-D15 for that. Hell, IBM already gets away with the 225 W TDP POWER8 flagships (12 cores/96 threads @ 4.3 GHz) on 1U cooling systems, both on air and on compact water, with up to 8 chips per board and 2 boards directly linked for a 16-chip node in a 2U space.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


AMD's problem is marketing. The 200 series was largely decent and competitive, yet people just flat out didn't buy it.

One day people will stop claiming that a different cooler does anything at all to the amount of heat produced.

It was quite popular when Bitcoin hit the media. However, you are quite right.

When did I make such a statement?

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


Yes they would. Were Nvidia to be the only discrete GPU manufacturer, anti-monopoly laws would kick in, forcing them to split the company, with a lot of lawsuits headed their way.

 

Sadly, there won't be any differentiation on the "discrete" part.

 

The FTC would consider Intel and Nvidia competition in the consumer GPU segment.

 

However, Intel would be forced to split as they'd have a monopoly on personal computer processors.

 

This is why Intel's strategy at the moment is to do nothing, and torture drip-feed the market with useless new chips that aren't upgrades, while Nvidia are actively trying to destroy AMD.

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.


Horrible business practices, misleading/blatant advertising about "GAMING", and a larger R&D budget than AMD.

There's no way I'll buy Nvidia again for the 9xx series and up unless they bring back something like Tesla, Fermi, or maybe Kepler.


If AMD go bust, imagine how much a 7950 or something like that will be worth in 5 years :0

#RIPTopGear  This is the best thread ever: http://linustechtips.com/main/topic/53190-i-can-not-get-hard/ " French meetings are just people sitting in a semi-circle shouting at each other" -Dom Jolly  :lol:

My rig: 

   CPU: Pentium G3258 @ 4.5GHz | GPU: GTX 760 reference | PSU: Corsair RM750 | Cooler: Cooler Master Seidon 120V | Motherboard: Gigabyte B85M D3H | Case: NZXT S340 White | RAM: 8GB EVO Potenza @ 1600MHz | Storage: 3TB Seagate HDD, 60GB OCZ SSD, 620GB Toshiba HDD | Mouse: Steelseries Rival @ 1000 CPI | OS: Windows 10 Pro | Phone: iPhone 6S 16GB
http://linustechtips.com/main/topic/439354-why-nvidia/
 

No they wouldn't be..?

US monopoly laws prevent them from having 100% market share; they would have to split, and that would be real messy.

CIS Student

Bleeding Panther

 


 

Intel Core i7-5820K | Deepcool CAPTAIN 360EX | Thermatake TG-7 | Gigabyte AORUS Xtreme Edition GTX 1080 Ti | MSI X99S Gaming 7 | G.Skill Ripjaws V Series 16GB (4 x 4GB) | Samsung 970 Evo 1TB | Samsung 850 EVO-Series 250GBx2 | Hitachi Ultrastar 7K3000 3TB x2 | Phanteks Enthoo EVOLV ATX Tempered Glass | EVGA 850 G2 | Corsair Air Series SP120x6 | Corsair STRAFE Wired Gaming Keyboard | Logitech MX Master 2S | Audio-Technica M50x Headphones


US monopoly laws prevent them from having 100% market share; they would have to split, and that would be real messy.

FFS, they do not. They only prevent you from achieving that 100% market share through anticompetitive practices defined in the laws: abusing your position as market leader via price gouging/fixing, bribing OEMs and/or retailers not to carry competitors' products, or buying up all your competitors to the point that you poison the well of competition. Getting a monopoly by earning it through raw competition and winning is perfectly legal in the American system.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Sadly, there won't be any differentiation on the "discrete" part.

 

The FTC would consider Intel and Nvidia competition in the consumer GPU segment.

 

However, Intel would be forced to split as they'd have a monopoly on personal computer processors.

 

This is why Intel's strategy at the moment is to do nothing, and torture drip-feed the market with useless new chips that aren't upgrades, while Nvidia are actively trying to destroy AMD.

No they wouldn't. Intel has no AIB solutions, and in fact it's AMD's and Nvidia's patents which prevent this. Further, integrated and discrete graphics do not compete with each other at all outside business/office computers. Even further, under DX12 and Vulkan there is multi-adapter support if programmers choose to use it. Intel would not be considered competition to Nvidia in the consumer segment at all, except perhaps in cellphone and tablet SoCs. The other major area is HPC, where you have the Xeon Phi vs. the Teslas.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Nvidia hasn't for years now. This is nothing new.

At least they don't sit on their asses and not do shit quite as much as Intel has the last few years....

CPU: i7 3770K @ 4.8 GHz | GPU: GTX 1080 FE SLI | RAM: 16GB (2x8GB) G.Skill Sniper 1866MHz | Mobo: Asus P8Z77-V LK | PSU: Rosewill Hive 1000W | Case: Corsair 750D | Cooler: Corsair H110 | Boot: 2x Kingston V300 120GB RAID 0 | Storage: 1x WD 1TB Green, 2x 3TB Seagate Barracuda

 


I like AMD, so this is really sad and worrying news for me. Not only does Nvidia get closer to having a monopoly, which means they can pretty much do whatever they want, but this also (obviously) won't help AMD with their financing.


The 750 Ti was released in early February 2014. It's an old card now, and it's about to be refreshed. It's also part of the first Maxwell generation, not the second like the 9xx series.

The 960 is worse than AMD's counterpart, that's true.

The 970 and the 390 are about equal; you won't notice the difference, maybe only at higher resolutions. It's all about the price: the cheapest one wins. Dunno about you guys in the US, but the 970 is cheaper than an R9 390 here.

I am in the UK and currently they are the exact same price, but obviously they fluctuate.


Apparently people here on LTT want to see AMD die... so NVIDIA and Intel can charge way more for their products... go ahead...

AMD Rig (Upgraded): FX 8320 @ 4.8 GHz, Corsair H100i GTX, ROG Crosshair V Formula, 16 GB 1866 MHz RAM, MSI R9 280X Gaming 3G @ 1150 MHz, Samsung 850 Evo 250 GB, Win 10 Home

(My first Intel + Nvidia experience, recently bought): MSI GT72S Dominator Pro G (i7 6820HK, 16 GB RAM, 980M SLI, G-Sync, 1080p, 2x 128 GB SSD + 1 TB HDD)... FeelsGoodMan


FFS, they do not. They only prevent you from achieving that 100% market share through anticompetitive practices defined in the laws: abusing your position as market leader via price gouging/fixing, or bribing OEMs and/or retailers not to carry competitors' products. Getting a monopoly by earning it through raw competition and winning is perfectly legal in the American system.

In Romania we have two big ISPs: UPC and RCS RDS. A few years ago RCS RDS wanted to buy UPC, which would have given them around 95% market share. However, our government didn't allow the purchase since it would create a monopoly.

i5 4670K @ 4.2GHz (Cooler Master Hyper 212 Evo); ASRock Z87 EXTREME4; 8GB Kingston HyperX Beast DDR3 RAM @ 2133MHz; Asus DirectCU GTX 560; Super Flower Golden King 550 Platinum PSU; 1TB Seagate Barracuda; Corsair 200R case.


In Romania we have two big ISPs: UPC and RCS RDS. A few years ago RCS RDS wanted to buy UPC, which would have given them around 95% market share. However, our government didn't allow the purchase since it would create a monopoly.

Purchases to gain monopolies are also prevented, based on a couple of simple mathematical functions which weigh societal benefit against the loss or gain of competition. But the broad brush stroke used was entirely incorrect.
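(For illustration, this presumably refers to market-concentration measures such as the Herfindahl-Hirschman Index used in US merger review; the sketch below assumes that is the kind of function meant, uses the DOJ/FTC guideline thresholds, and the market shares are made up.)

```python
# Hypothetical sketch of an HHI merger screen (2010 DOJ/FTC Horizontal
# Merger Guidelines); the share numbers below are invented examples.

def hhi(shares):
    """Sum of squared market shares in percent; ranges from near 0 to 10,000."""
    return sum(s ** 2 for s in shares)

def merger_screen(pre_shares, merging):
    """Return pre-merger HHI, post-merger HHI, and the increase."""
    post_shares = [s for i, s in enumerate(pre_shares) if i not in merging]
    post_shares.append(sum(pre_shares[i] for i in merging))
    pre, post = hhi(pre_shares), hhi(post_shares)
    return pre, post, post - pre

# Example: two ISPs at 60% and 35% plus a 5% fringe competitor merging.
pre, post, delta = merger_screen([60.0, 35.0, 5.0], merging=(0, 1))
print(f"HHI before: {pre:.0f}, after: {post:.0f}, increase: {delta:.0f}")
# A post-merger HHI above 2,500 combined with an increase above 200 is
# presumed likely to enhance market power under those guidelines.
```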

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Let me just drop  this here:

 

http://www.eurogamer.net/articles/2013-01-16-former-amd-employees-accused-of-stealing-secrets-for-nvidia

 

This article is from January 2013; the point where AMD really started losing big is around the start of 2014! Call me a tinfoil hat, but AMD has been losing their battle ever since both Nvidia and Intel started taking their employees.

 

Also, and I keep mentioning this: AMD's marketing and naming schemes aren't clear to me! R7, R9, Fury / FX, A6, A8, A10, Athlon, Phenom.

 

Now compare that to Nvidia and Intel: GT(X) / Celeron, Pentium, Core M, i3, i5, i7.

 

Also, when you look on YouTube for example, we always see comparisons being made to the Nvidia cards as well as the Intel CPUs!

 

Even with new Nvidia cards, the comparisons are made between last gen cards instead of the AMD cards!


Let me just drop  this here:

 

http://www.eurogamer.net/articles/2013-01-16-former-amd-employees-accused-of-stealing-secrets-for-nvidia

 

This article is from January 2013; the point where AMD really started losing big is around the start of 2014! Call me a tinfoil hat, but AMD has been losing their battle ever since both Nvidia and Intel started taking their employees.

 

Also, and I keep mentioning this: AMD's marketing and naming schemes aren't clear to me! R7, R9, Fury / FX, A6, A8, A10, Athlon, Phenom.

 

Now compare that to Nvidia and Intel: GT(X) / Celeron, Pentium, Core M, i3, i5, i7.

 

Also, when you look on YouTube for example, we always see comparisons being made to the Nvidia cards as well as the Intel CPUs!

 

Even with new Nvidia cards, the comparisons are made between last gen cards instead of the AMD cards!

 

AMD's problem is a marketing problem. That's all.

AMD, with a way lower R&D budget, is on par with NVIDIA, and it's bringing new tech to the industry and pushing it forward on several fronts.

Like I said: show us each marketing budget and then we can talk about market share. You see a lot of comments here talking literal shit, thinking that market share = tech superiority lol... and they actually believe that xD

Of course such news snowballs into kids thinking that one rules over the other because NVIDIA's high-end card has 5 more frames per second on average than AMD's high-end card.

 

And remember - you can be a grown man and still be a kid :)


 

Why stop saying it when it's true? The resistance of any circuit is partly determined by temperature. The hotter you let it run, the more power it uses, dissipated as heat according to current² × resistance, where the resistance = resistivity of the material × length of the conductor ÷ cross-sectional area, and the resistivity itself varies with temperature (by a function that is a huge mess).

 

Wow, OK, this hinges on a bunch of assumptions. Firstly, you're assuming that a CPU, which is a bunch of semiconductors, behaves identically to a length of copper wire. Huge citation needed here.

Secondly, you are overestimating the impact of heat on resistance by a huge amount. This equation is useful when determining the resistance curve of a light bulb, i.e. something getting so hot that it is glowing, at hundreds of degrees Celsius. For a given clock speed, what difference can an aftermarket vs. stock cooler actually make? 40 degrees? That is nowhere near enough to add any appreciable resistance.
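(For reference, plugging a 40 K difference into the linear temperature-coefficient model for plain copper, with α ≈ 0.0039 per kelvin, gives the arithmetic below; whether that copper-wire model applies to a CPU die at all is exactly what is in dispute here.)

```latex
\frac{R(T_{0} + 40\,\mathrm{K})}{R(T_{0})}
\approx 1 + \alpha \cdot 40\,\mathrm{K}
\approx 1 + 0.0039 \times 40
\approx 1.16
```

i.e. roughly a 16% change in resistance for a plain copper conductor under that model.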

 

No need to point out that you are not an electrical engineer, that's quite apparent.

 

Horrible business practices, misleading/blatant advertising about "GAMING", and a larger R&D budget than AMD.

There's no way I'll buy Nvidia again for the 9xx series and up unless they bring back something like Tesla, Fermi, or maybe Kepler.

 

...you're pissed off that the 900 series was advertised as gaming cards? o.0


Wow, OK, this hinges on a bunch of assumptions. Firstly, you're assuming that a CPU, which is a bunch of semiconductors, behaves identically to a length of copper wire. Huge citation needed here.

Secondly, you are overestimating the impact of heat on resistance by a huge amount. This equation is useful when determining the resistance curve of a light bulb, i.e. something getting so hot that it is glowing, at hundreds of degrees Celsius. For a given clock speed, what difference can an aftermarket vs. stock cooler actually make? 40 degrees? That is nowhere near enough to add any appreciable resistance.

No need to point out that you are not an electrical engineer, that's quite apparent.

...you're pissed off that the 900 series was advertised as gaming cards? o.0

This law applies to all resistors, regardless of extra qualities (Halliday & Resnick, Fundamentals of Physics, 6th edition). All conductors are resistors unless they are superconductors.

The temperature factor is a quadratic-order function. You're underestimating it.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


 

 

 

Wow, OK, this hinges on a bunch of assumptions. Firstly, you're assuming that a CPU, which is a bunch of semiconductors, behaves identically to a length of copper wire. Huge citation needed here.

Secondly, you are overestimating the impact of heat on resistance by a huge amount. This equation is useful when determining the resistance curve of a light bulb, i.e. something getting so hot that it is glowing, at hundreds of degrees Celsius. For a given clock speed, what difference can an aftermarket vs. stock cooler actually make? 40 degrees? That is nowhere near enough to add any appreciable resistance.

 

No need to point out that you are not an electrical engineer, that's quite apparent.

 

 

...you're pissed off that the 900 series was advertised as gaming cards? o.0

 

I'm just pissed that it's straight up blatant stupid advertising.

