
Nvidia might delay RTX 40 series on account of a 30 series market flood

Rym
4 hours ago, Poinkachu said:

Haha, if the community were capable of pulling through with this kind of movement, I'm pretty sure Nvidia or any other PC parts company wouldn't be able to pull many shady moves.

Even during the GPUpocalypse some people still bought anyway, even at jacked-up prices.

 

The absolute worst version of the rumors floating around (Nvidia hoarding warehouses full of GPUs to artificially drive up prices, etc.) could be proven true and gamers would still crawl over each other to buy the next-gen cards because over 9000% more ray tracing jiggawatts or whatever. 

 

Do I need to dig up that screenshot of everybody in the "Boycott Modern Warfare 2" Steam group from back in the day?

 

Gamers have no morals, self-respect, or spine. 

Corps aren't your friends. "Bottleneck calculators" are BS. Only suckers buy based on brand. It's your PC, do what makes you happy.  If your build meets your needs, you don't need anyone else to "rate" it for you. And talking about being part of a "master race" is cringe. Watch this space for further truths people need to hear.

 

Ryzen 7 5800X3D | ASRock X570 PG Velocita | PowerColor Red Devil RX 6900 XT | 4x8GB Crucial Ballistix 3600mt/s CL16


18 minutes ago, Middcore said:

Gamers have no morals, self-respect, or spine. 

They do until it inconveniences them.

I'm not actually trying to be as grumpy as it seems.

I will find your mentions of Ikea or Gnome and I will /s post. 

Project Hot Box

CPU 13900K, Motherboard Gigabyte Aorus Elite AX, RAM CORSAIR Vengeance 4x16GB 5200MHz, GPU Zotac RTX 4090 Trinity OC, Case Fractal Pop Air XL, Storage Sabrent Rocket Q4 2TB, CORSAIR Force Series MP510 1920GB NVMe, CORSAIR Force Series MP510 960GB NVMe, PSU CORSAIR HX1000i, Cooling Corsair XC8 CPU block, Bykski GPU block, 360mm and 280mm radiators, Displays Odyssey G9, LG 34UC98-W 34-inch, Keyboard Mountain Everest Max, Mouse Mountain Makalu 67, Sound AT2035, Massdrop 6XX headphones, Go XLR 

Oppbevaring

CPU i9-9900K, Motherboard ASUS ROG Maximus XI Code, RAM 48GB Corsair Vengeance LPX 3200MHz (2x16GB + 2x8GB), GPUs Asus ROG Strix 2070 8GB, PNY 1080, Nvidia 1080, Case Mining Frame, Storage 2x Samsung 860 Evo 500GB, PSU Corsair RM1000x and RM850x, Cooling Asus ROG Ryuo 240 with Noctua NF-12 fans

 

Why is the 5800x so hot?

 

 


5 hours ago, Rym said:

 

 

Summary

 Due to an oversupply of used RTX 30 series GPUs, as well as board partners and retailers having too many GPUs, Nvidia is considering delaying the release of the RTX 40 series GPUs.

 

Quotes

 

My thoughts

 Well, how do I put this gently? 

 

That is entirely NOT my problem. Nvidia knew they were selling everything they had to miners; I called out mining two years ago as a short-term thing that would come crashing down, and I ignored the naysayers. All those used GPUs were obviously going to find their way onto the used market, driving prices down. I had thought about getting an RTX 4080 or 4090, but now I'm also hearing Nvidia is trying to cause an artificial shortage to drive up the prices of their new GPUs. Honestly, I will now very likely go for an AMD RX 7900 XT, especially since it looks set to release earlier than Nvidia's GPUs. I'm also hearing people won't be buying an Nvidia GPU out of pure spite for what Nvidia did to their tried-and-true customers over the past two years. AMD, on the other hand, doesn't have this oversupply issue, since they never made that many GPUs for miners to begin with, and they are looking to release their GPUs very soon.

 

Sources

 https://www.pcgamesn.com/nvidia/rtx-4000-gpu-launch-delay-geforce-3000-oversupply

That is hilarious. Good job Nvidia for selling to miners.


45 minutes ago, porina said:

For context, non-GAAP gross margin for some companies taken off official results where possible

You obviously know more than me about finance and so on, and I don't think that any of the companies you mentioned above "have to" lower their profit, since they sell luxury items anyway. Also, we can't really estimate their per-unit profit safely, because the R&D share can't be known up front: it's the total R&D cost divided by however many units they eventually sell (total R&D cost / units sold), plus driver support costs, and we don't know their failure rates and so on. 

My point is that it's kind of good for the end user that they make that much, because then they will spend more on the aforementioned things, and we get to enjoy a stable product that is evolving really fast compared to the majority of the other products we use daily.
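To put that per-unit R&D point into a quick back-of-the-envelope sketch (the figures below are completely made up for illustration, not anything Nvidia or AMD actually report), the estimated cost per card, and with it the apparent per-unit profit, swings a lot depending on how many units eventually sell:

def cost_per_unit(total_rnd, bom, units_sold):
    # Fixed R&D spread over however many units end up selling, plus bill of materials.
    return total_rnd / units_sold + bom

# Hypothetical numbers, purely for illustration.
TOTAL_RND = 2_000_000_000  # $2B R&D for a GPU generation
BOM = 300                  # $300 bill of materials per card
PRICE = 700                # $700 selling price

for units in (5_000_000, 10_000_000, 20_000_000):
    cost = cost_per_unit(TOTAL_RND, BOM, units)
    print(f"{units:,} units sold: cost ~${cost:,.0f}/card, per-unit margin ~{(PRICE - cost) / PRICE:.0%}")

Same card, same price, but the "profit per unit" goes from roughly 0% to over 40% just by changing the assumed sales volume, which is exactly why those per-unit figures can't really be pinned down from the outside.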


9 minutes ago, PeachGr said:

Also we can't really assume safely their profit because rnd cannot be accurate because it's divided by the amount of products that they will sell eventually (Total rnd cost/ amount of units sold), drivers support cost, also we don't know their fail rates and so on.

I was trying to provide context for why that number is the way it is, not whether those numbers are great or bad. I used gross margin because it is a commonly reported value in financials, so it's easy to look up. I suppose what you might have been thinking of is an everything-considered profit; I don't know if that has a proper name in financial speak. Like GM it'll be an average across all products, so it might not be too meaningful in isolation. There's probably enough info in financial reports to work that out if someone were sufficiently determined to do so, but I'm not that person. I'm no expert in these reports either, but some of it rubbed off on me from my previous employment.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


I'm hearing a lot of people sort of talking like Nvidia was basically gambling on crypto here. They didn't anticipate a crypto crash that everyone knew was very likely. 

 

The thing is though, anyone paying attention to crypto has known that mining was going away very soon. I know the meme is that they keep delaying the merge, but we've been pretty sure it was happening this year for at least six months now, and it's been the tentative plan for at least a year.


2 hours ago, porina said:

I don't know if that has a proper name in financial speak. Like GM it'll be an average across all products, so it might not be too meaningful in isolation.

ASP (Average Selling Price)? AMD talks about this in their quarterly reports; Nvidia does not. AMD, however, as far as I can tell, does not give actual figures, just reports the trend.


5 hours ago, PeachGr said:

If you were into 3D animation, you'd know that everybody else gives zero fucks about us, while Nvidia, as a monopoly, still provides big features for rendering, AI, etc. So no, I will not crap on them for being expensive. I know they make about 60% profit per unit, but they still do things right and my world wouldn't be the same without them. Btw, if I were only a gamer, I could have a different opinion, but until then, Nvidia all the way.


This goes to show how close-minded Nvidia wants you to be. I really hope you realize that all those things you use, starting with CUDA, are proprietary / closed source. Gamers have been patching FSR 2.0 into games and it's been working flawlessly. Then you talk about 3D animation, a good part of which, by the way, the cards only handle well thanks to custom-written drivers, and you, as a hobby 3D artist or whatever, will get the shaft in the worst ways possible (just look at the prices of high-end Quadro cards).

In the HPC environment, AMD holds decent ground with their CPUs and GPUs. Yet every day, as we speak, the people at Nvidia are laughing their way to the bank. They're the definition of an anti-consumer company. You can find many examples of this, such as their attempts to buy out other companies, or how dysfunctional they are on *nix operating systems.

macOS conversions also work better on AMD, Boot Camp drivers for example... you see, there are also many cases where AMD offers better alternatives.


Gamers: Woe is me, there are no GPUs anywhere to buy.

Gamers: Woe is me, there's too many GPUs, and now I can't buy the newest shiny thing.

🌲🌲🌲

 

 

 

◒ ◒ 


5 minutes ago, Arika S said:

Gamers: Woe is me, there are no GPUs anywhere to buy.

Gamers: Woe is me, there's too many GPUs, and now I can't buy the newest shiny thing.

Shortages of the last two years have made me realize that the bulk of gamers are the bulk of PCMR, and I don't like being associated with them.

I'm not actually trying to be as grumpy as it seems.

I will find your mentions of Ikea or Gnome and I will /s post. 

Project Hot Box

CPU 13900K, Motherboard Gigabyte Aorus Elite AX, RAM CORSAIR Vengeance 4x16GB 5200MHz, GPU Zotac RTX 4090 Trinity OC, Case Fractal Pop Air XL, Storage Sabrent Rocket Q4 2TB, CORSAIR Force Series MP510 1920GB NVMe, CORSAIR Force Series MP510 960GB NVMe, PSU CORSAIR HX1000i, Cooling Corsair XC8 CPU block, Bykski GPU block, 360mm and 280mm radiators, Displays Odyssey G9, LG 34UC98-W 34-inch, Keyboard Mountain Everest Max, Mouse Mountain Makalu 67, Sound AT2035, Massdrop 6XX headphones, Go XLR 

Oppbevaring

CPU i9-9900K, Motherboard ASUS ROG Maximus XI Code, RAM 48GB Corsair Vengeance LPX 3200MHz (2x16GB + 2x8GB), GPUs Asus ROG Strix 2070 8GB, PNY 1080, Nvidia 1080, Case Mining Frame, Storage 2x Samsung 860 Evo 500GB, PSU Corsair RM1000x and RM850x, Cooling Asus ROG Ryuo 240 with Noctua NF-12 fans

 

Why is the 5800x so hot?

 

 


11 hours ago, maartendc said:

Boy, that turned sour for them pretty fast...

 

At the start of 2022, they couldn't keep enough GPUs on shelves. Now, only 6 months later, there is an oversupply and they cannot even give them away.

 

It turned very quickly from:

- Miners buying everything --> miners selling off their rigs.

- People wanting to spend money --> People holding off spending due to inflation

- People being drawn to 30 series --> people holding off for 40 series later this year.

 

I am seeing the first signs in my local market (EU) that miners are liquidating their GPUs: sellers on second-hand websites with 5+ GPUs for sale, some acknowledging that they were used for mining. 

 

At the same time, new GPU prices have fallen off a cliff, although they are still at or above MSRP in most cases. They will have to drop prices even further, to BELOW MSRP, if they are ever to sell all the stock they have left.

 

I am thinking of waiting a few more months, and then snapping up a dirt cheap used 20 or 30 series. Pricing will continue to fall off a cliff. Time to finally replace my old 980Ti.

Yeah, I’m definitely looking to replace my GTX 960 here. The $250 RX 6600 was very tempting, but as I don’t play a ton of games, I’m looking to keep it under $200. Just looking for a card to handle light games (Final Fantasy XII, Valkyria Chronicles, emulation, and a few others) at 1440P or 4K. 

My eyes see the past…

My camera lens sees the present…


6 hours ago, Arika S said:

Gamers: Woe is me, there are no GPUs anywhere to buy.

Gamers: Woe is me, there's too many GPUs, and now i can't buy the newest shiny thing.

So as a consumer, I should disregard my personal interests, and the fact that buying a card right now may not be the best course of action for me?

Why should I spend my hard-earned money now on a product that I know for sure is going to be beaten in price/performance in a few months? Call me a whiner if you want, but it makes no effin' sense, not just from a "gamer" PoV, but from a general consumer PoV.

Also, the company, be it Nvidia, AMD, or Intel, isn't going to give two shits about what's best for me, so why should I give two shits about their oversupply issues?

 

Besides, this comes from MLID, so they might as well be reading the future from their horoscope.
And if Nvidia does decide to delay, it might just give AMD an incentive to rush forward and take their new cards to market first, giving them a few months of """"""monopoly"""""". AMD could certainly do with a bump in their market share (assuming their cards are worth buying).


16 hours ago, porina said:

The MSRPs of the new models were set to better reflect the changing realities of the world. Everything is going up. The Ampere launch-day pricing is unlikely to ever return. Where I am, used 3080 GPUs are still selling for more than that on eBay even with the so-called flood of mining cards. AMD are also affected, with the 6800 XT still above MSRP, keeping in mind there are more factors at play here, such as AMD generally being less desirable at a given marketing tier.

 

Presuming you meant the 2080 Ti, the 3080 is significantly higher performance, as it is the 3070 that goes up against the 2080 Ti. The old process used on Turing made it a bit of a monster to manufacture, hence the high cost. 

 

The 1630 serves a specific part of the market that is not performance-gaming oriented. In that sense, it does what it set out to do. 

Typo, my bad: I actually meant the 3080 Ti vs 3080 pricing. In my opinion, the 3080 MSRP of $699 was "normal pricing" at the time. The 3080 Ti released just 9 months after the 3080 at a 70% price increase for a 13% performance increase. That is not just "inflation pricing"; inflation is like 8-10%. This was just Nvidia "scalping" its own GPUs because of the shortage. It was Nvidia kicking themselves for selling the 3080 "too cheap" and course-correcting by releasing a much more expensive card at essentially the same performance tier. They probably cut production of 3080s in favor of 3080 Tis. They did the same when they released a 12GB model of the 3080 that they could sell for WAY more than the 10GB model at almost no performance increase; they probably cut production of the 10GB models so they could sell more of the 12GB model. I have seen anecdotal evidence of people who backordered a 3080 10GB back in 2020, and it NEVER came back in stock at the retailers. Nvidia just kept coming out with newer, more expensive SKUs that they could profit more from.

 

This is just free market economics. But people who claim Nvidia "didn't benefit" from the shortage and inflated prices are just wrong.

 

I am not saying the 1630 should be gaming-oriented; I know there is a market for these cards for HTPCs and the like. I am saying the 1630 should be a sub-$100 product. The 1030 was $79. This is $200 for basically some video outputs and a decoder? Gimme a break. In a "normal" market they couldn't get away with that, as people would just grab a used RX 570 or something for $100 if they needed some video outputs. The 1630 is intended as a pure cash grab by Nvidia. Hopefully nobody will buy it now.
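Putting that 3080 vs 3080 Ti comparison into rough numbers, here is a quick sketch of cost per unit of performance. It uses only the figures quoted above (the $699 3080 MSRP, roughly 70% higher price, roughly 13% more performance), which are approximations from this thread rather than exact retail figures:

rtx_3080_price = 699.0                      # 3080 MSRP cited above
rtx_3080_ti_price = rtx_3080_price * 1.70   # ~70% price increase cited above
relative_performance = 1.13                 # ~13% performance increase cited above

per_perf_3080 = rtx_3080_price / 1.0
per_perf_3080_ti = rtx_3080_ti_price / relative_performance

print(f"3080:    ~${per_perf_3080:.0f} per unit of performance")
print(f"3080 Ti: ~${per_perf_3080_ti:.0f} per unit of performance "
      f"(+{per_perf_3080_ti / per_perf_3080 - 1:.0%} worse value)")

On those numbers the 3080 Ti costs roughly 50% more per unit of performance than the 3080, which is the "scalping its own GPUs" point in a single figure.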


13 hours ago, porina said:

Gross margin is the nearest thing we have to a profit measure, and simplified it is (revenue - cost of goods) / revenue. If you sell something that costs $1 to make at $2, that's 50% gross margin. Somewhere in the ballpark of 50% is decent for a decent-volume tech company. I would have expected Apple to be higher, but then again, maybe their products do cost more to make as a proportion of their selling price.

The high-end iPhones have around a 50% profit margin after BoM and assembly costs, not counting any costs for R&D and paying a shitload of salaries.
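As a minimal sketch of the simplified gross-margin formula quoted above, using the $1-cost, $2-price example from that post:

def gross_margin(revenue, cost_of_goods):
    # Simplified gross margin: (revenue - cost of goods) / revenue
    return (revenue - cost_of_goods) / revenue

# Something that costs $1 to make and sells for $2 is a 50% gross margin.
print(f"{gross_margin(2.0, 1.0):.0%}")  # -> 50%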


13 hours ago, Arika S said:

Gamers: Woe is me, there are no GPUs anywhere to buy.

Gamers: Woe is me, there's too many GPUs, and now I can't buy the newest shiny thing.

Gamers get mad at me when I keep using the "entitled" word, but then they keep proving me right, lol. These same people kept wishing for the death of mining because they "need" GPUs. Mining finally dies down, GPUs flood the market, and now they are still angry at the miners because their inventory is supposedly preventing them from getting cards that don't even exist yet anyway. To make matters worse, half of these gamers don't even need a 4000 series card because they still use garbage monitors. A 3080 is quite decent at driving 4K 120Hz with a solid framerate nowadays, even more so with DLSS being a thing. 

 

You will never be able to satisfy these people as nothing is ever enough. Release a sequel to a game that is too similar to the last game? Get roasted for lack of innovation. Release a sequel that is too different from the first game? Get roasted for deviating too hard from "the good game".

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


On 7/11/2022 at 11:26 AM, Motifator said:


I do own a 1030, but it's only been used for pretty much screen output. You can argue about the encode part, but the encoder on those cards is also pretty weaksauce and actually competes with CPUs, which are priced equally while serving a better overall purpose... people buy them mindlessly, not because they're feature-packed or anything. For FPS, you could have bought Polaris cards, or even Nvidia's own 1060 (which I also happen to have), back in the day for much lower prices.

Weighing the options on an RX 6400 low profile (which in itself is quite a bit faster than a 1030), I ended up just getting a new CPU for about the same cost... the 5600G's integrated graphics are also faster than the 1030... However, I did end up finding a good deal on an open-box 3060.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


Quote

Sources close to YouTuber Moore’s Law Is Dead

Stopped reading there.

That idiot "Moore's Law is Dead" is wrong more often than a broken clock, and whenever one of his terrible predictions that he likes to claim are rumors from reliable resources turns out t be wrong he always blames Nvidia.

 

The specs that he "leaked" were wrong? Nvidia leaked incorrect information on purpose to fool him!

The prices that he "leaked" were wrong? Nvidia changed the prices last minute just so that he would be incorrect!

The names he "leaked" were wrong? Nvidia changed the names last minute!

The performance numbers he "leaked" were wrong? Nvidia throttled the cards up until the release date just so that leaks would be incorrect!

 

 

You'd have to be an idiot to actually believe anything he says. He just makes shit up, calls it leaks, blames someone else when he is wrong, and pretends he is super reliable when he is right.

Remember when Moore's Law is Dead, on three separate occasions, said the RTX 30 series would be built on three different process nodes? In one video he said it would be TSMC, in another he said Samsung 10nm, and in a third video he said Samsung 8nm. It turned out Samsung 8nm was correct, so once that was confirmed he kept saying "as I leaked earlier" and "I was right", and never mentioned the two other guesses he got wrong.

 

 

Let me guess, Moore's Law is Dead had previously guessed (or as he likes to call guessing, "leaked") that the 40 series cards would launch in October. Now new rumors are suggesting that his guess was wrong, so instead of admitting that he is full of shit he will pretend like Nvidia are the bad guys and are postponing the launch.


7 hours ago, MageTank said:

To make matters worse, half of these gamers don't even need a 4000 series card because they still use garbage monitors. A 3080 is quite decent at driving 4k 120hz with a solid framerate nowadays, even more so with DLSS being a thing. 

No, you don't understand, I need 600 fps instead of 500 fps in CS:GO on my 1080p 75Hz monitor that I run in 4:3, because the internet told me it's what I need to become a world-famous esports player. My GPU is the only thing stopping Team Liquid from begging at my feet for me to join them.

 

 

Or something, I stopped listening to people like this. 

🌲🌲🌲

 

 

 

◒ ◒ 


On 7/11/2022 at 9:30 PM, Middcore said:

Gamers have no morals, self-respect, or spine. 

On 7/11/2022 at 9:49 PM, IkeaGnome said:

They do until it inconveniences them.

On 7/12/2022 at 2:47 AM, Arika S said:

Gamers: Woe is me, there are no GPUs anywhere to buy.

Gamers: Woe is me, there's too many GPUs, and now I can't buy the newest shiny thing.

15 hours ago, MageTank said:

Gamers get mad at me when I keep using the "entitled" word, but then they keep proving me right, lol.

 

Wow.

 

Thank you so much for making others feel welcome here. I think it's the first time in many years I feel insulted in a forum.

Overgeneralizing is not a nice thing to do.


16 hours ago, MageTank said:

To make matters worse, half of these gamers don't even need a 4000 series card because they still use garbage monitors.

I think you need to check your sources again; last I checked it was more like 57% with certified Garbage Monitors (TM).

 

But it's nice to see the LTT-Forum Mining-Circlejerk once again coming together, reassuring each other how great of an idea mining is and how collectively stupid gamers are.


2 hours ago, Dracarris said:

But it's nice to see the LTT-Forum Mining-Circlejerk once again coming together, reassuring each other how great of an idea mining is and how collectively stupid gamers are.

???????????

 

I don't think anyone in this thread has said that mining is a great idea.

 

You can think gamers are stupid without being pro-mining.

🌲🌲🌲

 

 

 

◒ ◒ 


3 hours ago, Rauten said:

 

Wow.

 

Thank you so much for making others feel welcome here. I think it's the first time in many years I feel insulted in a forum.

Overgeneralizing is not a nice thing to do.

Do you really identify as one of those entitled gamers who need the 4000 series tomorrow, won't stand a company delaying their new toy, and think that any use case other than gaming is stupid? If so, I have bad news for you.

 

Otherwise, I don't see why you feel insulted.

 

On 7/11/2022 at 9:46 PM, Motifator said:


This goes to show how close-minded Nvidia wants you to be. I really hope you realize that all those things you use, starting with CUDA, are proprietary / closed source. Gamers have been patching FSR 2.0 into games and it's been working flawlessly. Then you talk about 3D animation, a good part of which, by the way, the cards only handle well thanks to custom-written drivers, and you, as a hobby 3D artist or whatever, will get the shaft in the worst ways possible (just look at the prices of high-end Quadro cards).

In the HPC environment, AMD holds decent ground with their CPUs and GPUs. Yet every day, as we speak, the people at Nvidia are laughing their way to the bank. They're the definition of an anti-consumer company. You can find many examples of this, such as their attempts to buy out other companies, or how dysfunctional they are on *nix operating systems.

macOS conversions also work better on AMD, Boot Camp drivers for example... you see, there are also many cases where AMD offers better alternatives.

They are a company that holds a monopoly in certain areas, such as the ML space, because they have no competition whatsoever, and it seems like other companies aren't even trying to catch up.

 

So yeah, they do engage in some shitty practices, but for their end consumers they provide something that works and that allows many people to make money with it. I'm not willing to spend my money on a company that doesn't properly support my needs, or does so in a half-assed way, and you can believe that I really tried to make my work stuff run on AMD, to no avail.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


13 minutes ago, igormp said:

Do you really identify as one of those entitled gamers who need the 4000 series tomorrow, won't stand a company delaying their new toy, and think that any use case other than gaming is stupid? If so, I have bad news for you.

 

Otherwise, I don't see why you feel insulted.

There were no distinctions in those posts; they referred to "gamers" in general.

And since my main hobby is gaming, I do consider myself a gamer.

 

I mean, just look at the post before yours:

34 minutes ago, Arika S said:

You can think gamers are stupid without being pro-mining

Lovely.


This topic is now closed to further replies.

