
AMD Worries Cryptomining GPU Demand Could Fall

11 minutes ago, Blademaster91 said:

Well yes, desktops have been for quite a while, but I don't get why they project that tablets will go up. Tablets are already selling less now that a smartphone does everything a tablet can. What I meant is that APU laptops are a bit of a niche: most people don't need better than an iGPU, and those who want a gaming laptop are buying one with a discrete GPU.

The Intel + Vega combo does give a really good counter to GTX 1060-style gaming laptops, though; laptops with higher-end discrete GPUs really don't sell in large volumes. I honestly don't see Intel pushing that product widely, though. If it were all their own tech, I could see them pushing it into every laptop that isn't an ultraportable, but that would be feeding AMD just a bit too much.


9 hours ago, Humbug said:

But 2-4 years ago it was all doom and gloom. Most people thought that AMD was done and that they would never come back from Bulldozer.

Yes, that is very true. They scored a 72%+ chance of bankruptcy for five years straight (2009-2014), and nearly every economist agreed they were in a very bad way. It wasn't until 2015/16 that their plan became observable and people could see tangible evidence they were moving forward. Even now that they are well on their way, their financials aren't exactly amazing, so I wouldn't say they are out of the woods yet. They might be improving greatly, but they still have to win the market share battle on two fronts, which means another few years of proving themselves in terms of product quality and longevity.

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


This mining craze has been going on for quite a long time now. Do you think it will ever come to an end?

It's really hard to believe it will, with GPU prices still going up and staying high.

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


On 3/2/2018 at 12:54 PM, Blademaster91 said:

Fixed that for you.

While it wouldn't be nearly that profitable over a couple of months at a 65% power target, it would be simple enough to slap on some AS5 or Conductonaut (with some nail polish over the vulnerable electronics around the GPU), and the fans wouldn't be in remotely that bad a state at such a low power target.
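For anyone curious how the math on a lowered power target pencils out, below is a back-of-the-envelope sketch. Every figure in it (hashrate, revenue per MH/s, wattage, electricity price, and the assumption that a memory-bound algorithm loses only a little hashrate at 65% power) is an illustrative placeholder, not a measurement:

```python
# Back-of-the-envelope mining profitability at a reduced power target.
# All numbers here are illustrative placeholders, not measured values.

def daily_profit(hashrate_mhs, usd_per_mhs_day, power_watts, usd_per_kwh):
    """Daily profit in USD: mining revenue minus electricity cost."""
    revenue = hashrate_mhs * usd_per_mhs_day
    electricity = (power_watts / 1000.0) * 24 * usd_per_kwh
    return revenue - electricity

# Hypothetical Vega-class card on a memory-bound algorithm (e.g. Ethash):
# dropping the power target to 65% costs relatively little hashrate.
stock = daily_profit(40.0, 0.10, 250, 0.12)         # 100% power target
tuned = daily_profit(36.0, 0.10, 250 * 0.65, 0.12)  # 65% power target

print(f"stock: ${stock:.2f}/day  |  65% target: ${tuned:.2f}/day")
```

The point isn't the exact dollar figures; it's that power draw (and with it heat and fan wear) falls much faster than revenue does, which is why a card babied at a low power target comes out of a mining stint in better shape than the horror stories suggest.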


1 hour ago, ravenshrike said:

While it wouldn't be nearly that profitable over a couple of months at a 65% power target, it would be simple enough to slap on some AS5 or Conductonaut (with some nail polish over the vulnerable electronics around the GPU), and the fans wouldn't be in remotely that bad a state at such a low power target.

Even my old-as-hell 6970 reference cooler is still going strong, running in a friend's computer right now. I ran that thing for years without ever turning my computer off and the fan still runs fine; a few years of mining won't really do much to make a card fail prematurely. It's a lot of fuss over not a lot: GPUs can break like anything else, but mining won't make them break much sooner, if at all.


I just hope that whatever profits AMD makes from the crypto craze will be enough to sustain themselves and reinvest enough into R&D to keep up. Not gonna lie, Vega was pretty underwhelming in gaming performance. I'm sure this will sound all too familiar, but let's hope that the next architecture (Navi) won't disappoint. That said, I love AMD cards for their feature set: Freesync, Crossfire, and their driver software, for instance. As a person who already has a GTX 1070, I cannot wait for the used market to be flooded with cheap AMD cards so I can finally use Freesync on my monitor.

I have no loyalty to either Nvidia or AMD; I just don't want market share to become extremely polarized toward either side, resulting in a monopoly.

(For those of you asking why I bought a Freesync monitor when I own an Nvidia card: I got a good deal on a 144Hz monitor that just happened to have Freesync, and I couldn't afford the extra $200 for a G-Sync display.)

 

I'm gonna break this down like a 5th-grade presentation:

Why doesn't Nvidia or AMD invest in developing ASICs? They could have a foothold in both markets, and miners wouldn't take as many cards away from the gaming marketplace. They have enough money to invest in R&D, plus the technical expertise in silicon architecture, power efficiency, software design, and marketing. Their brands could appeal to casual miners who would otherwise buy off-the-shelf GPUs, especially if they developed an intuitive software interface to manage the ASIC hardware. Finally, the used market would not be flooded with used GPUs after a crypto crash. A flooded used market would harm both AMD and Nvidia, because people would buy used cards instead of new ones, hurting both companies' income and taking away money that could otherwise be reinvested into R&D for new products. And if the used market doesn't get flooded, consumers aren't harmed either: new GPUs would stay affordable because miners would demand the ASICs instead of ordinary GPUs, preventing the hyperinflation of graphics card prices we see today.

 

This is vastly oversimplified, very theoretical, and has no chance of actually happening. But it's fun to dream...


1 hour ago, Aw_Ginger_Snapz said:

I just hope that whatever profits AMD makes from the crypto craze will be enough to sustain themselves and reinvest enough into R&D to keep up.

 

AMD made profits again in 2017, mostly thanks to the CPU division and the success of products based on Zen cores.

http://www.tomshardware.com/news/amd-stock-financials-earnings-cpu,36430.html

Profits will be bigger in 2018 because there will be many more Zen-based products on the market for the majority of the year, including lucrative APUs for the laptop and desktop markets and EPYC server parts. In 2017, Zen was only targeted at enthusiast DIY PC guys like us.

 

I too hope that the Radeon Technologies Group will start competing again at the high end, but this factor is not going to make or break AMD when you look at the numbers. Basically, they can afford to fail and try again, whereas if Zen had failed, AMD would have been done.

 

They are ramping up their R&D because they finally can afford to. However, you will not see the fruits of this for years.

https://overclock3d.net/news/cpu_mainboard/amd_has_increased_their_r_d_budget_by_almost_22_in_the_past_year/1

 


1 hour ago, Aw_Ginger_Snapz said:

 

Why doesn't Nvidia or AMD invest in developing ASICs?

If they did, wouldn't it lose the flexibility of a general-purpose GPU, i.e. the ability to compute or mine anything: Bitcoin, Ethereum, or any other currency?


Just now, Humbug said:

I too hope that the Radeon Technologies Group will start competing again at the high end, but this factor is not going to make or break AMD when you look at the numbers. Basically, they can afford to fail and try again, whereas if Zen had failed, AMD would have been done.

I agree. I just can't see AMD consciously continuing to compete in the graphics card market if they continuously lose money. After all, why would a business take R&D money away from a market where they can compete (CPUs) and instead invest it in a market where they can't (GPUs)?


13 hours ago, leadeater said:

Even my old-as-hell 6970 reference cooler is still going strong, running in a friend's computer right now. I ran that thing for years without ever turning my computer off and the fan still runs fine; a few years of mining won't really do much to make a card fail prematurely. It's a lot of fuss over not a lot: GPUs can break like anything else, but mining won't make them break much sooner, if at all.

Depending on the design of the card, secondary parts can fail without adequate cooling if run 24/7. But that's not remotely a concern in a single-GPU rig.


1 minute ago, Humbug said:

If they did, wouldn't it lose the flexibility of a general-purpose GPU, i.e. the ability to compute or mine anything: Bitcoin, Ethereum, or any other currency?

No. The idea is for these ASIC machines to be a separate device from a GPU; it wasn't for AMD/Nvidia to stop selling regular GPUs. The point is for them to make ASICs that encourage miners to stop buying GPUs and instead buy a standalone device that only mines currencies, lowering the demand for GPUs and making them cheaper as a result.


4 minutes ago, Aw_Ginger_Snapz said:

No. The idea is for these ASIC machines to be a separate device from a GPU; it wasn't for AMD/Nvidia to stop selling regular GPUs. The point is for them to make ASICs that encourage miners to stop buying GPUs and instead buy a standalone device that only mines currencies, lowering the demand for GPUs and making them cheaper as a result.

Would the manufacture of these ASICs not require many of the same electronic components and much of the same fabrication capacity that are required to make a graphics card? So won't it further dwindle the supply of GPUs?


[Image: "oh look at all the fucks I give"]

 

Sorry, I'm struggling to build budget systems for people right now due to the crypto craze, so AMD can just suck it up.

PC - NZXT H510 Elite, Ryzen 5600, 16GB DDR4-3200 (2x8GB), EVGA 3070 FTW3 Ultra, Asus VG278HQ 165Hz

 

Mac - 1.4GHz i5, 4GB DDR3 1600MHz, Intel HD 5000 (x2)

 

Endlessly wishing for a BBQ in space.


10 minutes ago, Humbug said:

Would the manufacture of these ASICs not require many of the same electronic components and much of the same fabrication capacity that are required to make a graphics card? So won't it further dwindle the supply of GPUs?

While ASICs do use many of the same components, the silicon architecture, power design, and memory footprint can all vary. Like I said in the original post, this is all a speculative scenario; I don't know what kind of architecture either company would use, so the only answer I can give you is that it depends. The most limiting factor would most likely be memory. After all, we're in a global DRAM famine because of the mobile phone market.


On 3/1/2018 at 8:57 PM, Energycore said:

Except, for some reason, R9 280s and 280Xs; you can find those for very cheap used. Sometimes I'll see an auction for a working R9 280 close below $100 on eBay.

 

OT: I think AMD might be worried about the used market being flooded like it was in 2014. Back then the crypto space dove into a bear market for a year and change, and every card that had been used for mining was fed into the used market, taking away from new GPU sales.

 

Nvidia didn't care about that because people will buy a rusty toaster for $500 if it's got the Geforce branding on it.

Remember when you could get fucking R9 290s for 200-ish USD? Those were the days...

If you want to reply to me or someone else, USE THE QUOTE BUTTON!
Pascal laptops guide

