
Crypto miners are hoarding early shipments of RTX 3080s

Re: Vega VII
IIRC Hardware Unboxed recently did a piece on the Vega VII. I got the impression it hadn't aged all that well.

 



40 minutes ago, valdyrgramr said:

To be fair to the card, the original point was a holdover for professionals and gamers waiting on Navi to launch. After that, unless you did server work or rendering, it made little to no sense. The card also wasn't discontinued when people kept saying it was, but only AMD was really stocking it because it was a hard product to move, i.e. a niche product. On the driver side there were issues that the XT also faced with multitasking: the screens would black out and your system pretty much hung from there.

Ironically, I understand the large amount of really fast memory still makes it the best card around for one or two even more niche things. I don't remember what exactly offhand. They're pretty specific though, like weird single-use functions having nothing whatsoever to do with gaming.


11 hours ago, Kisai said:

No, but the power companies know exactly who is doing large-scale cryptocoin mining, because they give them bulk discounts on energy to burn dirty power. Even people who are doing it in their home are easily detected; lest you forget, "grow ops" are still illegal in many places.

 

https://www.vice.com/en_us/article/mbk7wy/how-much-energy-growing-weed-vs-mining-bitcoin

 

https://www.cryptoglobe.com/latest/2019/02/australia-police-raid-cryptocurrency-miner-s-home-seeking-weed-grow-op/

 

https://techland.time.com/2011/05/23/report-police-confuse-bitcoin-miners-power-use-for-weed-grow-op/

 

 

At worst, if you live in a typical apartment, it has 100A service (mine has 4 15A circuits; the rest is the stove), so the maximum usable power is 172.8kWh/day if I were to unplug the fridge and not do anything with the stove. BC's power is 10.29c/kWh, so it would cost $17.79/day to maximize the power use in the apartment without any alterations to it. Assuming a residential home could maximize an entire 100A service, that's 288kWh/day, or 8640kWh/mo, which is still just $29.64/day. However, that's a gross estimate since the actual cost is tiered here. Actual residential is about 9.3c/kWh for the first 1350kWh and then 14c/kWh, and likewise the (legal) farm rate is 11.2c/kWh.

 

 

If you read that last article, you'll note that one city in BC permits the RCMP to search a home if it uses more than 93kWh per day. If you do the math, that's slightly under one third of a 100A service, or 8 of 24 hours at full load. So the assumption is that, short of having a hot tub or heated pool, there is no logical explanation for a sustained draw of more than 32.3A. 30A itself is two standard 15A circuits, and a typical PC that runs 24 hours simply doesn't hit that. You likely won't even cross that threshold unless you have seven HEDT desktops maxed out and running 24 hours per day.
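To sanity-check the arithmetic in the quoted post above, here's a minimal sketch in Python. The rates, the 100A/120V service assumption, and the 93kWh/day warrant threshold are the figures quoted above; the two-tier rate is simplified to a single billing period, and the helper names are mine.

```python
# Rough sketch of the BC power math quoted above. Rates/thresholds are
# the figures from the post, not official tariff schedules.

TIER1_RATE = 0.093           # $/kWh for the first 1350 kWh
TIER2_RATE = 0.14            # $/kWh after that
TIER1_LIMIT_KWH = 1350
WARRANT_KWH_PER_DAY = 93     # the search threshold mentioned above

def tiered_cost(kwh: float) -> float:
    """Cost of `kwh` in one billing period under the two-tier rate."""
    tier1 = min(kwh, TIER1_LIMIT_KWH)
    tier2 = max(kwh - TIER1_LIMIT_KWH, 0.0)
    return tier1 * TIER1_RATE + tier2 * TIER2_RATE

# Maxing out a 100 A, 120 V service around the clock:
kwh_per_day = 100 * 120 / 1000 * 24              # 288 kWh/day
print(f"{kwh_per_day:.0f} kWh/day, "
      f"${kwh_per_day * 0.1029:.2f}/day at a flat 10.29c/kWh")
print(f"30 days, tiered: ${tiered_cost(kwh_per_day * 30):.2f}")

# The 93 kWh/day threshold as a sustained current draw:
amps = WARRANT_KWH_PER_DAY * 1000 / 24 / 120
print(f"93 kWh/day = a sustained {amps:.1f} A at 120 V")  # ~32.3 A
```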

 

Meanwhile, in rural America, your average household has 200A 240V service…though average single-family home usage is around 600kWh/month. Most also have the option to get 3-phase business lines if more power or 3-phase use is desired, such as for some shop applications. Generally, we don't have tiered power for residential, though more places are moving towards TOU (time of use) pricing. Most business options get significantly more expensive the higher your peak burst draw is (demand charges), until you're over 100MW of normal use. Average "business" use across America is just over 6MWh/month (roughly 10x average home use). Power pricing varies a lot though…California power is more than 2x as expensive as Nevada's, for example, while someplace like Hawaii is 2x Cali pricing.

 

Me, I have that usual rural 200A 240V service coming in…and 3.1kW of solar on the roof, and another 5kW solar install in progress (making a large patio/tractor storage spot with a solar roof). They're not 100% optimal in setup (one is west-facing, as an example), but if they were, and I did some weather/time-of-day scheduling on a ton of mining rigs, the utilities wouldn't have a clue. If I instead decided to max out everything with an optimal setup, I could theoretically pull about 56kW on a sunny day (200A × 240V = 48kW from the service, plus ~8kW of solar)…though trying to wire that correctly and safely would be insanity. LOL


17 minutes ago, justpoet said:

Meanwhile, in rural America, your average household has 200A 240V service…though average single-family home usage is around 600kWh/month. Most also have the option to get 3-phase business lines if more power or 3-phase use is desired, such as for some shop applications. Generally, we don't have tiered power for residential, though more places are moving towards TOU (time of use) pricing. Most business options get significantly more expensive the higher your peak burst draw is (demand charges), until you're over 100MW of normal use. Average "business" use across America is just over 6MWh/month (roughly 10x average home use). Power pricing varies a lot though…California power is more than 2x as expensive as Nevada's, for example, while someplace like Hawaii is 2x Cali pricing.

 

Me, I have that usual rural 200A 240V service coming in…and 3.1kW of solar on the roof, and another 5kW solar install in progress (making a large patio/tractor storage spot with a solar roof). They're not 100% optimal in setup (one is west-facing, as an example), but if they were, and I did some weather/time-of-day scheduling on a ton of mining rigs, the utilities wouldn't have a clue. If I instead decided to max out everything with an optimal setup, I could theoretically pull about 56kW on a sunny day (200A × 240V = 48kW from the service, plus ~8kW of solar)…though trying to wire that correctly and safely would be insanity. LOL

You do not live where I live. I live in an older urban residential neighborhood that has never been zoned commercial. 100A is the max here, and 3-phase is astoundingly expensive because they have to build it out. Also, in the US the standard for apartments is 60 amps, and you can (barely) get 4 of them on a 100-amp service.

 

That there are places like that, I definitely believe. I don't know how many of them there actually are, though.


I went down the rabbit hole on CUDA/TensorFlow stuff this morning, and I think there may actually be a market there for used RTX cards.

 

[Image: FP32 deep learning benchmark chart (fp32-2080ti-1.png) from the Lambda Labs article linked below]

 

https://lambdalabs.com/blog/2080-ti-deep-learning-benchmarks/

 

For example, a GTX 1080 Ti and an RTX 2080 (presumably running plain CUDA) have about the same FP32 performance, presumably because the RTX 2080 has fewer CUDA cores. So if a "mining" GPU isn't worth using anymore for crypto, it's probably still very useful for AI learning stuff; even if it's a little more expensive to run, it's not usually pegged at 100% the entire time.
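As a rough sanity check on that 1080 Ti vs 2080 comparison, the theoretical peak FP32 numbers do line up. A back-of-the-envelope sketch (core counts and reference boost clocks are published specs; real benchmarks like the Lambda chart above also depend on memory bandwidth and sustained clocks):

```python
# Theoretical peak FP32: 2 FLOPs (one FMA) per CUDA core per clock.
cards = {
    "GTX 1080 Ti": (3584, 1.582e9),  # CUDA cores, boost clock (Hz)
    "RTX 2080":    (2944, 1.710e9),
}

for name, (cores, clock) in cards.items():
    tflops = 2 * cores * clock / 1e12
    print(f"{name}: ~{tflops:.1f} TFLOPS FP32")

# ~11.3 vs ~10.1 TFLOPS: fewer cores at a higher clock roughly cancel
# out, which matches the near-identical FP32 benchmark results.
```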


The crypto miners should buy up all the used 2080 graphics cards and run them into the ground until they burn up.


*PTSD flashbacks*


On 9/8/2020 at 7:52 PM, RejZoR said:

I hope this crap is fake. We really don't need this nonsense again... Also, all this mining nonsense needs to die.

Correct.

 

Mining needs to die. It is just a bubble built on nothing.

 

Also, all cryptomining combined uses more power worldwide than a small country such as Switzerland (https://www.theverge.com/2019/7/4/20682109/bitcoin-energy-consumption-annual-calculation-cambridge-index-cbeci-country-comparison). So not only is it annoying, it is actually harmful to the environment in terms of power usage.

 

I thought all this mining nonsense was not profitable anymore anyway? If this is true, I guess I will be sticking with my GTX 980Ti for a while longer.. :)


18 minutes ago, rrubberr said:

If someone wants to toss me three or four 2080s for OpenCL and CUDA acceleration I'd be glad to take them 👼

I'm remembering a recent post where @LAwLz or @Kisai or someone (I always get them confused; it's the pigtails) posted an interesting graph on various compute things. That's actually perhaps not impossible, according to those graphs.


8 minutes ago, Bombastinator said:

I'm remembering a recent post where @LAwLz or @Kisai or someone (I always get them confused; it's the pigtails) posted an interesting graph on various compute things. That's actually perhaps not impossible, according to those graphs.

You're probably thinking of @Kisai. That post is on the previous page, here:  

3 hours ago, Kisai said:

I went down the rabbit hole on CUDA/TensorFlow stuff this morning, and I think there may actually be a market there for used RTX cards.

 

[Image: FP32 deep learning benchmark chart (fp32-2080ti-1.png) from the Lambda Labs article linked below]

 

https://lambdalabs.com/blog/2080-ti-deep-learning-benchmarks/

 

For example, a GTX 1080 Ti and an RTX 2080 (presumably running plain CUDA) have about the same FP32 performance, presumably because the RTX 2080 has fewer CUDA cores. So if a "mining" GPU isn't worth using anymore for crypto, it's probably still very useful for AI learning stuff; even if it's a little more expensive to run, it's not usually pegged at 100% the entire time.

 

Here is the source for those benchmarks:

https://lambdalabs.com/blog/2080-ti-deep-learning-benchmarks/


6 hours ago, Bombastinator said:

I'm remembering a recent post where @LAwLz or @Kisai or someone (I always get them confused; it's the pigtails) posted an interesting graph on various compute things. That's actually perhaps not impossible, according to those graphs.

I posted that on the previous page of this thread. And my avatar is original and unique.

 

The thing with the OpenCL/CUDA compute stuff is that it depends on the libraries used. The stuff I was looking at was explicitly the CUDA 10.0 Toolkit, and it gets confusing when there are three different version numbers referenced (the 10.0 Toolkit, the compute capability, and cuDNN 7.6.5 for CUDA 10.0).

https://www.tensorflow.org/install/ https://www.tensorflow.org/install/gpu

Quote

The following GPU-enabled devices are supported:

  • NVIDIA® GPU card with CUDA® architectures 3.5, 3.7, 5.2, 6.0, 6.1, 7.0 and higher than 7.0. See the list of CUDA®-enabled GPU cards.
  • On systems with NVIDIA® Ampere GPUs (CUDA architecture 8.0) or newer, kernels are JIT-compiled from PTX and TensorFlow can take over 30 minutes to start up. This overhead can be limited to the first start up by increasing the default JIT cache size with: 'export CUDA_CACHE_MAXSIZE=2147483648' (see JIT Caching for details).
  • For GPUs with unsupported CUDA® architectures, or to avoid JIT compilation from PTX, or to use different versions of the NVIDIA® libraries, see the Linux build from source guide.
  • Packages do not contain PTX code except for the latest supported CUDA® architecture; therefore, TensorFlow fails to load on older GPUs when CUDA_FORCE_PTX_JIT=1 is set. (See Application Compatibility for details.)

So Ampere (the GeForce RTX 3000 parts) needs cuDNN 8.x to make use of the new features on those cards. cuDNN 8.x still supports Kepler (GeForce GTX 780 or higher) parts, but only specific models with compute capability 3.5+. GeForce GTX 10xx parts are 6.1 and RTX 20xx parts are 7.5.

 

Confused enough yet? Basically any GTX 780 part or better is still usable for CUDA neural-net stuff, but the libraries released in the last year or so pretty much require a GTX 10xx or 20xx card.

 

Which means that if you're doing this kind of stuff anyway, any working 10xx or 20xx part is interchangeable (e.g. an xx80 with another xx80 part).
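If you want to check which compute capability a card reports before untangling toolkit/cuDNN versions, TensorFlow itself can tell you. A minimal sketch, assuming a CUDA-enabled TensorFlow 2.3+ install:

```python
# List visible GPUs and their compute capability (TensorFlow 2.3+).
import tensorflow as tf

for gpu in tf.config.list_physical_devices("GPU"):
    details = tf.config.experimental.get_device_details(gpu)
    name = details.get("device_name", gpu.name)
    cc = details.get("compute_capability")  # e.g. (6, 1) for GTX 10xx
    print(f"{name}: compute capability {cc}")
```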

 


3 hours ago, Kisai said:

Confused enough yet? Basically any GTX 780 part or better is still usable for CUDA neural-net stuff, but the libraries released in the last year or so pretty much require a GTX 10xx or 20xx card.

 

Which means that if you're doing this kind of stuff anyway, any working 10xx or 20xx part is interchangeable (e.g. an xx80 with another xx80 part).

 

All libraries support any NVIDIA card from the 600 series onward (I've never seen anyone try with anything older).

 

You can use any card interchangeably. The really annoying part is installing all the correct software versions (in case you need specific ones) in distros such as Ubuntu or Debian, and that's independent of the GPU itself.

 

13 hours ago, Kisai said:

For example, a GTX 1080 Ti and an RTX 2080 (presumably running plain CUDA) have about the same FP32 performance, presumably because the RTX 2080 has fewer CUDA cores.

That's for FP32. In FP16 the 2080 can be twice as fast. Even a regular 2060 blows the 1080 out of the water at lower precision.
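For anyone wanting that FP16 speedup in practice, recent TensorFlow exposes it as a one-line global policy. A minimal sketch, assuming TensorFlow 2.4+ (on cards without tensor cores this mostly saves memory rather than doubling throughput):

```python
# Mixed FP16/FP32 training (TensorFlow 2.4+): matmuls/convs run in
# float16, variables stay float32; Turing+ tensor cores do the heavy
# lifting.
import tensorflow as tf

tf.keras.mixed_precision.set_global_policy("mixed_float16")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(512, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(10),
    # Keep the output in float32 for numerical stability:
    tf.keras.layers.Activation("softmax", dtype="float32"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
print(model.layers[0].compute_dtype)  # float16
```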

10 hours ago, rrubberr said:

If someone wants to toss me three or four 2080s for OpenCL and CUDA acceleration I'd be glad to take them 👼

I actually want to snatch a used 2080 Ti, since it'll be better than a 3070 for ML haha


27 minutes ago, igormp said:

All libraries support any nvidia card from the 600 series onward (never saw anyone trying with anything older).

The 780 (non-mobile) is the lowest card that supports compute capability 3.5: https://developer.nvidia.com/cuda-gpus

 

Quote

You can use any card interchangeably. The really annoying part is installing all the correct software versions (in case you need specific ones) in distros such as Ubuntu or Debian, and that's independent of the GPU itself.

That's for FP32. In FP16 the 2080 can be twice as fast. Even a regular 2060 blows the 1080 out of the water at lower precision.

I actually want to snatch a used 2080 Ti, since it'll be better than a 3070 for ML haha

I'm sure it does. I'm just pointing out that if one is buying old mining kit, it's easier to grab these regardless of the condition they were used in. I wouldn't pick one for gaming, however, since a new RTX 30xx card would be cheaper, and present gaming requirements don't really justify a 3080 or 3090.


8 minutes ago, Kisai said:

The 780 (non-mobile) is the lowest card that supports compute capability 3.5: https://developer.nvidia.com/cuda-gpus

That's only for the newest binary versions of the libraries (which, believe me, are hardly ever used).

TF versions prior to 1.10 still support compute capability 3.0, and you can always build even the newest version from source to support it.

 

Trust me, that's irrelevant.


On 9/8/2020 at 12:15 PM, AndreiArgeanu said:

Oh noooo, now I can't waste the $700 that I don't have on a new graphics card, what am I gonna do?

While I understand your attempt at a joke... there are a lot more people who have $700 to spare and want a new card.

 

Even if you can't empathize with people wanting to spend their own money on something you don't see value in, the lack of supply would still suck if true.

"Do what makes the experience better" - in regards to PCs and Life itself.

 

Onyx AMD Ryzen 7 7800x3d / MSI 6900xt Gaming X Trio / Gigabyte B650 AORUS Pro AX / G. Skill Flare X5 6000CL36 32GB / Samsung 980 1TB x3 / Super Flower Leadex V Platinum Pro 850 / EK-AIO 360 Basic / Fractal Design North XL (black mesh) / AOC AGON 35" 3440x1440 100Hz / Mackie CR5BT / Corsair Virtuoso SE / Cherry MX Board 3.0 / Logitech G502

 

7800X3D - PBO -30 all cores, 4.90GHz all core, 5.05GHz single core, 18286 C23 multi, 1779 C23 single

 

Emma : i9 9900K @5.1Ghz - Gigabyte AORUS 1080Ti - Gigabyte AORUS Z370 Gaming 5 - G. Skill Ripjaws V 32GB 3200CL16 - 750 EVO 512GB + 2x 860 EVO 1TB (RAID0) - EVGA SuperNova 650 P2 - Thermaltake Water 3.0 Ultimate 360mm - Fractal Design Define R6 - TP-Link AC1900 PCIe Wifi

 

Raven: AMD Ryzen 5 5600x3d - ASRock B550M Pro4 - G. Skill Ripjaws V 16GB 3200Mhz - XFX Radeon RX6650XT - Samsung 980 1TB + Crucial MX500 1TB - TP-Link AC600 USB Wifi - Gigabyte GP-P450B PSU -  Cooler Master MasterBox Q300L -  Samsung 27" 1080p

 

Plex : AMD Ryzen 5 5600 - Gigabyte B550M AORUS Elite AX - G. Skill Ripjaws V 16GB 2400Mhz - MSI 1050Ti 4GB - Crucial P3 Plus 500GB + WD Red NAS 4TBx2 - TP-Link AC1200 PCIe Wifi - EVGA SuperNova 650 P2 - ASUS Prime AP201 - Spectre 24" 1080p

 

Steam Deck 512GB OLED

 

OnePlus: 

OnePlus 11 5G - 16GB RAM, 256GB NAND, Eternal Green

OnePlus Buds Pro 2 - Eternal Green

 

Other Tech:

- 2021 Volvo S60 Recharge T8 Polestar Engineered - 415hp/495tq 2.0L 4cyl. turbocharged, supercharged and electrified.

Lenovo 720S Touch 15.6" - i7 7700HQ, 16GB RAM 2400MHz, 512GB NVMe SSD, 1050Ti, 4K touchscreen

MSI GF62 15.6" - i7 7700HQ, 16GB RAM 2400 MHz, 256GB NVMe SSD + 1TB 7200rpm HDD, 1050Ti

- Ubiquiti Amplifi HD mesh wifi

 

Link to comment
Share on other sites

Link to post
Share on other sites

34 minutes ago, Dedayog said:

While I understand your attempt at a joke... there are a lot more people who have $700 to spare and want a new card.

 

Even if you can't empathize with people wanting to spend their own money on something you don't see value in, the lack of supply would still suck if true.

You're saying that as if 99% of people have $700 to spend on a graphics card. Generally speaking, less than 1% of people buying GPUs will get the 3080, or spend that much money on a new GPU at all. There's a reason why the top 5 most popular GPUs in the Steam hardware survey are cheaper GPUs, mostly under $400, and not 2080s and 2080 Tis.

And I don't understand what's with the empathy part; ain't no one crying or dying over the fact that there may be a chance of GPU shortages. This ain't no tragedy; there's no need for empathy. If you can't take a joke, move on and don't be butthurt about it.



On 9/8/2020 at 12:10 PM, Master Disaster said:

It really doesn't matter to Nvidia; a sale is a sale at the end of the day.

What happens when the mining market dies and the PC gaming community just migrates to consoles because they're fed up with $1500 mid-tier cards?


1 minute ago, descendency said:

What happens when the mining market dies and the PC gaming community just migrates to consoles because they're fed up with $1500 mid-tier cards?

 

Nothing. Most of their money comes from selling Quadros to servers; desktop is a side business. It's the same with AMD, only more so, since they're also in consoles, which are the second-biggest dGPU market after servers.


5 hours ago, Kisai said:

I posted that on the previous page of this thread. And my avatar is original and unique.

 

The thing with the OpenCL/CUDA compute stuff is that it depends on the libraries used. The stuff I was looking at was explicitly the CUDA 10.0 Toolkit, and it gets confusing when there are three different version numbers referenced (the 10.0 Toolkit, the compute capability, and cuDNN 7.6.5 for CUDA 10.0).

https://www.tensorflow.org/install/ https://www.tensorflow.org/install/gpu

So Ampere (the GeForce RTX 3000 parts) needs cuDNN 8.x to make use of the new features on those cards. cuDNN 8.x still supports Kepler (GeForce GTX 780 or higher) parts, but only specific models with compute capability 3.5+. GeForce GTX 10xx parts are 6.1 and RTX 20xx parts are 7.5.

Confused enough yet? Basically any GTX 780 part or better is still usable for CUDA neural-net stuff, but the libraries released in the last year or so pretty much require a GTX 10xx or 20xx card.

Which means that if you're doing this kind of stuff anyway, any working 10xx or 20xx part is interchangeable (e.g. an xx80 with another xx80 part).

 

Still has pigtails. I'm not knocking your avatar; I'm knocking my brain.

I get confused by pigtails, and you think I won't get confused by that?

The saddest part for me is I'm pretty sure neither party actually has pigtails.


13 minutes ago, AndreiArgeanu said:

You're saying that as if 99% of people have $700 to spend on a graphics card. Generally speaking, less than 1% of people buying GPUs will get the 3080, or spend that much money on a new GPU at all. There's a reason why the top 5 most popular GPUs in the Steam hardware survey are cheaper GPUs, mostly under $400, and not 2080s and 2080 Tis.

And I don't understand what's with the empathy part; ain't no one crying or dying over the fact that there may be a chance of GPU shortages. This ain't no tragedy; there's no need for empathy. If you can't take a joke, move on and don't be butthurt about it.


I don't know how accurate the Steam hardware survey is (I'm pretty sure it's voluntary), but I don't doubt most people are buying cards in the $200-300 range, not on a $700 budget, since you can build a whole PC for $700, or get a console and a TV for that much. And people with $700 sitting around should have another hobby, or a console they can play games on if they sold their GPU.

2 minutes ago, descendency said:

What happens when the mining market dies and the PC gaming community just migrates to consoles because they're fed up with $1500 mid-tier cards?

I wonder how many people are fed up with Nvidia increasing prices since the RTX 2000 series. The x80 card should be $550-600, not "starting at" $700 with AIB models being $50-100 more.


12 minutes ago, Blademaster91 said:

I don't know how accurate the Steam hardware survey is (I'm pretty sure it's voluntary), but I don't doubt most people are buying cards in the $200-300 range, not on a $700 budget, since you can build a whole PC for $700, or get a console and a TV for that much. And people with $700 sitting around should have another hobby, or a console they can play games on if they sold their GPU.

I wonder how many people are fed up with Nvidia increasing prices since the RTX 2000 series. The x80 card should be $550-600, not "starting at" $700 with AIB models being $50-100 more.

One thing I don't know is what the actual costs are. Research and development is going to be insanely high and has to be added to manufacturing cost, but those end-of-year margins still look pretty dang healthy. Corpulent, even.


48 minutes ago, AndreiArgeanu said:

You're saying that as if 99% of people have $700 to spend on a graphics card. Generally speaking, less than 1% of people buying GPUs will get the 3080, or spend that much money on a new GPU at all. There's a reason why the top 5 most popular GPUs in the Steam hardware survey are cheaper GPUs, mostly under $400, and not 2080s and 2080 Tis.

And I don't understand what's with the empathy part; ain't no one crying or dying over the fact that there may be a chance of GPU shortages. This ain't no tragedy; there's no need for empathy. If you can't take a joke, move on and don't be butthurt about it.


The 2000 series had relatively low sales among PC gamers because it was a total rip-off, price-to-performance-wise.

 

The overwhelming majority of 1000-series owners didn't upgrade to the 2000 series because there wasn't an attractive proposition. Because the 3000 series offers a huge increase in performance, and because 1000-series owners' cards are getting a bit dated now, sales figures for the 3000 series will likely be far higher than the 2000 series'.


1 hour ago, CarlBar said:

Nothing. Most of their money comes from selling Quadros to servers; desktop is a side business. It's the same with AMD, only more so, since they're also in consoles, which are the second-biggest dGPU market after servers.

Actually, consumer gaming is 50% of NVIDIA sales, as per their latest report.

 

For AMD, the personal computer market is over 70% of their revenue.


23 minutes ago, igormp said:

Actually, consumer gaming is 50% of NVIDIA sales, as per their latest report.

 

For AMD, the personal computer market is over 70% of their revenue.

 

Is that by volume or by profit, though? Server parts tend to be very high-margin but, as @leadeater pointed out to me recently, not high-volume.

