You thought GTX 1060 was confusing? You've seen nothing. Enter RTX 2060 and 6 variants of the same card

Bouzoo
25 minutes ago, Master Disaster said:

Got a source on this? I've followed CEMU from pretty much day one, I've even had conversations with the developer (back before he left GBATemp), and I've never heard anyone suggest you run the emulator through WINE on Linux instead of Windows. Adding an extra layer of complexity to an emulator is never a good idea, plus AMD's Linux drivers are terrible, so why anyone would suggest running anything AMD-based on Linux over Windows is beyond me.

 

OK, so I did some googling before posting this and it turns out you're partially correct. Using some custom AMD drivers on Linux it's possible to get up to a 40% speed boost in CEMU, but only if your GPU is weak and causing a bottleneck; if your GPU isn't bottlenecking, then at best you see no difference. That said, I can find nowhere the CEMU developer recommending Linux over Windows; the origins of this method come from Reddit.

Not sure if an i7-4790K bottlenecks an R9 390, but I had over 45 FPS in Linux while I was hardly getting 20 in Windows. It's not just about a bottleneck: Linux's AMD drivers are open source, so their OpenGL implementation is better overall.

 

The drivers I'm talking about are the Mesa ones. Not official.


5 minutes ago, huilun02 said:

Of course this comes from none other than the same company that made GPP

And the 1050 and 1060 lineup


I see Jensen is trying to give Intel a run for its money on the SKU count, huh?

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


I like the way everyone complains before we even know whether any difference in the GPUs is just the memory controller or not.

 

More options aren't intrinsically evil unless they are intentionally obfuscating details and charging more. 

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


7 minutes ago, mr moose said:

I like the way everyone complains before we even know whether any difference in the GPUs is just the memory controller or not.

 

More options aren't intrinsically evil unless they are intentionally obfuscating details and charging more. 

I can't see why that many are necessary. Having a few, sure, but 3 different models with the same memory type isn't going to reduce the cost that much, so any cost difference is mostly going to be artificial or minimal, so why should it exist?

 

The GDDR5 vs GDDR6 models make sense, as GDDR6 is obviously going to be more expensive, but this is silly:

15 hours ago, NunoLava1998 said:
  • RTX 2060 GDDR5X 3GB
  • RTX 2060 GDDR5X 4GB
  • RTX 2060 GDDR5X 6GB
  • RTX 2060 GDDR6 3GB
  • RTX 2060 GDDR6 4GB
  • RTX 2060 GDDR6 6GB

 

I question the need for two 6GB variants, but I guess having a cheaper higher-end option is nice if the GDDR6 model is going to be that much more expensive. But 3GB and 4GB of both options... dumb.

 

If Nvidia really wants that many, here's a crazy idea: RTX 2060 = GDDR5 and RTX 2060 Ti = GDDR6.


[Squidward face-smash GIF]

Use this guide to fix text problems in your post. Go here and here for all your power supply needs

 

New Build Currently Under Construction! See here!!!! -----> 

 

Spoiler

Deathwatch:[CPU I7 4790K @ 4.5GHz][RAM TEAM VULCAN 16 GB 1600][MB ASRock Z97 Anniversary][GPU XFX Radeon RX 480 8GB][STORAGE 250GB SAMSUNG EVO SSD Samsung 2TB HDD 2TB WD External Drive][COOLER Cooler Master Hyper 212 Evo][PSU Cooler Master 650M][Case Thermaltake Core V31]

Spoiler

Cupid:[CPU Core 2 Duo E8600 3.33GHz][RAM 3 GB DDR2][750GB Samsung 2.5" HDD/HDD Seagate 80GB SATA/Samsung 80GB IDE/WD 325GB IDE][MB Acer M1641][CASE Antec][[PSU Altec 425 Watt][GPU Radeon HD 4890 1GB][TP-Link 54MBps Wireless Card]

Spoiler

Carlile: [CPU 2x Pentium 3 1.4GHz][MB ASUS TR-DLS][RAM 2x 512MB DDR ECC Registered][GPU Nvidia TNT2 Pro][PSU Enermax][HDD 1 IDE 160GB, 4 SCSI 70GB][RAID CARD Dell Perc 3]

Spoiler

Zeonnight [CPU AMD Athlon x2 4400][GPU Sapphire Radeon 4650 1GB][RAM 2GB DDR2]

Spoiler

Server [CPU 2x Xeon L5630][PSU Dell Poweredge 850w][HDD 1 SATA 160GB, 3 SAS 146GB][RAID CARD Dell Perc 6i]

Spoiler

Kero [CPU Pentium 1 133Mhz] [GPU Cirrus Logic LCD 1MB Graphics Controller] [Ram 48MB ][HDD 1.4GB Hitachi IDE]

Spoiler

Mining Rig: [CPU Athlon 64 X2 4400+][GPUS 9 RX 560s, 2 RX 570][HDD 160GB something][RAM 8GBs DDR3][PSUs 1 Thermaltake 700w, 2 Delta 900w 120v Server modded]

RAINBOWS!!!

 

QUOTE ME SO I CAN SEE YOUR REPLIES!!!!


Where are the 3.5GB versions? 0/10 wouldn't get it

ASUS X470-PRO • R7 1700 4GHz • Corsair H110i GT P/P • 2x MSI RX 480 8G • Corsair DP 2x8 @3466 • EVGA 750 G2 • Corsair 730T • Crucial MX500 250GB • WD 4TB


Holy balls, nVidia. I was already getting confused when you used to rebrand your last-gen higher-tier chips as budget ones for the next gen. Now this!? Thank god I haven't bought an nVidia card since the GTS 450.

Brands I wholeheartedly recommend (though they do have flawed products): Apple, Razer, Corsair, Asus, Gigabyte, bequiet!, Noctua, Fractal, GSkill (RAM only)

Wall Of Fame (Informative people/People I like): @Glenwing @DrMacintosh @Schnoz @TempestCatto @LogicalDrm @Dan Castellaneta

Useful threads: 

How To Make Your Own Cloud Storage

Spoiler

 

Guide to Display Cables/Adapters

Spoiler

 

PSU Tier List (Latest)-

Spoiler

 

 

Main PC: See spoiler tag

Laptop: 2020 iPad Pro 12.9" with Magic Keyboard

Spoiler

PCPartPicker Part List: https://pcpartpicker.com/list/gKh8zN

CPU: AMD Ryzen 9 3900X 3.8 GHz 12-Core OEM/Tray Processor  (Purchased For $419.99) 
Motherboard: Asus ROG Crosshair VIII Formula ATX AM4 Motherboard  (Purchased For $356.99) 
Memory: G.Skill Trident Z RGB 32 GB (2 x 16 GB) DDR4-3000 Memory  (Purchased For $130.00) 
Storage: Kingston Predator 240 GB M.2-2280 NVME Solid State Drive  (Purchased For $40.00) 
Storage: Crucial MX300 1.05 TB 2.5" Solid State Drive  (Purchased For $100.00) 
Storage: Western Digital Red 8 TB 3.5" 5400RPM Internal Hard Drive  (Purchased For $180.00) 
Video Card: Gigabyte GeForce RTX 2070 8 GB WINDFORCE Video Card  (Purchased For $370.00) 
Case: Fractal Design Define R6 USB-C ATX Mid Tower Case  (Purchased For $100.00) 
Power Supply: Corsair RMi 1000 W 80+ Gold Certified Fully Modular ATX Power Supply  (Purchased For $120.00) 
Optical Drive: Asus DRW-24B1ST/BLK/B/AS DVD/CD Writer  (Purchased For $75.00) 
Total: $1891.98
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2020-04-02 19:59 EDT-0400

身のなわたしはる果てぞ  悲しわたしはかりけるわたしは


It's pretty much a shit show by now.

Personal Desktop:

CPU: Intel Core i7 10700K @5ghz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490UD ATX|~| RAM: 16gb DDR4 3333mhzCL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot:  SSD WD Green M.2 2280 240GB |~| Storage: 1x3TB HDD 7200rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950XT  |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Aorus Master |~| RAM: 32G Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)

2 hours ago, leadeater said:

I can't see why that many are necessary. Having a few, sure, but 3 different models with the same memory type isn't going to reduce the cost that much, so any cost difference is mostly going to be artificial or minimal, so why should it exist?

 

The GDDR5 vs GDDR6 models make sense, as GDDR6 is obviously going to be more expensive, but this is silly:

 

I question the need for two 6GB variants, but I guess having a cheaper higher-end option is nice if the GDDR6 model is going to be that much more expensive. But 3GB and 4GB of both options... dumb.

 

Again, if there is no difference other than the amount/type of RAM and that works out to give more options, then why is it such an issue?

Quote

If Nvidia really wants that many, here's a crazy idea: RTX 2060 = GDDR5 and RTX 2060 Ti = GDDR6.

 

But there is a GPU difference outside of RAM on the Ti model.

 

So long as the products are clearly marked and they aren't changing the number of cores like they did with the 1060, I don't see it as a problem.

 

 

EDIT: And besides that, people are so busy looking for something to dump on that they haven't even started arguing that the 2060 isn't big enough to actually make any use of the RTX hardware:

[attached meme image]

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


1 hour ago, mr moose said:

Again, if there is no difference other than the amount/type of RAM and that works out to give more options, then why is it such an issue?

I think the real issue is whether there will be other differences besides VRAM capacity.

 

It should be made clear in order to avoid confusion. 

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


1 hour ago, mr moose said:

But there is a GPU difference outside of RAM on the Ti model.

And so there was with the 1060 3GB and 1060 6GB, and so there will be with the 3GB and 4GB models. Not a lot, I hope, but the memory bus width will differ for sure, because it's actually impossible for it not to, so there is going to be a memory bandwidth difference even within the same GDDR type.
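
To put rough numbers on that, here's a back-of-the-envelope sketch (assuming the bus widths those capacities would most plausibly use, 192-bit for 3GB/6GB and 128-bit for 4GB, and ballpark per-pin rates of 8 Gbps for GDDR5 and 14 Gbps for GDDR6; none of these figures are confirmed 2060 specs):

# Peak bandwidth in GB/s = bus width (bits) * per-pin rate (Gbps) / 8.
# Bus widths and per-pin rates below are assumptions for illustration only.
def bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    return bus_width_bits * gbps_per_pin / 8

configs = [
    ("3GB GDDR5, 192-bit", 192, 8),
    ("4GB GDDR5, 128-bit", 128, 8),
    ("3GB GDDR6, 192-bit", 192, 14),
    ("4GB GDDR6, 128-bit", 128, 14),
]
for name, bus, rate in configs:
    print(f"{name}: {bandwidth_gb_s(bus, rate):.0f} GB/s")
# Prints 192, 128, 336 and 224 GB/s respectively: the bus width alone
# already separates the models, even within the same GDDR type.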

 

1 hour ago, mr moose said:

Again, if there is no difference other than the amount/type of RAM and that works out to give more options, then why is it such an issue?

Yes, there is a difference: "Which one should I pick? There are too many options and I don't understand them all." It was a lot easier to explain the 1060 3GB and 1060 6GB, even with the slight GPU core difference.

 

More options are not better, and options that don't make any difference are even worse. More options that inflate the prices of other products that otherwise wouldn't have been inflated are terrible. BOM cost wise, the difference between the 3GB and 4GB is going to be minute; what we pay will be controlled to give meaning to those two different models, so there won't be just a $10 difference, and even that is likely more than the parts cost difference between 3GB and 4GB.

 

Never confuse or assume that more options means better; first you must show that having more is better before you can say it's better.


They've been allowing multiple models of the x60 variants since the 200 series:

192 cores vs 216 cores, 896MB vs 1.7GB,

and I had the 1.7GB 216-core version, which was nice.

 

Thinking about it, I think they had multiple versions of the 8600 too.


9 minutes ago, leadeater said:

Never confuse or assume that more options means better; first you must show that having more is better before you can say it's better.

I am not the one making assumptions here. No one knows what the difference is other than memory.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Just now, pas008 said:

They've been allowing multiple models of the x60 variants since the 200 series:

192 cores vs 216 cores, 896MB vs 1.7GB,

and I had the 1.7GB 216-core version, which was nice.

 

Thinking about it, I think they had multiple versions of the 8600 too.

True, there's often been more than one, but the issue isn't that there is; it's that there are 6, and that's never been the case before. You don't need 6 variants of the same GPU. We have model numbers, so if there actually is a meaningful difference between those 6 options then use model numbers, not sub-naming. If there isn't a meaningful difference between the 6, then how many are actually needed? 2? 3? 4?

 

This is a mid-range GPU, correct? If GDDR6 isn't actually going to make a performance difference, then why put it on the cards and increase the cost? It would make sense to use it on GPU configs that need the memory bandwidth, but then you're moving away from variants to actual GPU core differences, so they should be using model numbers.

 

Nvidia is likely using the 2060 to increase the volume of GDDR6 and lower its cost, so it makes business sense there. But with the GDDR5 options existing, which will/should be cheaper, if GDDR6 doesn't have a performance impact I doubt many are going to pay more for nothing.


8 minutes ago, mr moose said:

I am not the one making assumptions here. No one knows what the difference is other than memory.

It is literally impossible to have the same memory bus width with 3GB and 4GB options; that's not an assumption. You can know that from the sizes of memory chips available and how many are required to make up 3GB and 4GB.
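
A quick sanity check of that, assuming the standard GDDR5/GDDR6 chip densities of the time (512MB and 1GB per chip) with one 32-bit channel per chip, and ignoring clamshell layouts:

# Which bus widths can exactly provide a given capacity with 512MB or 1GB chips,
# each on its own 32-bit channel? (Assumed densities; clamshell mode ignored.)
CHIP_SIZES_GB = (0.5, 1.0)
CHANNEL_BITS = 32

def possible_bus_widths(capacity_gb):
    widths = set()
    for chip_gb in CHIP_SIZES_GB:
        n_chips = capacity_gb / chip_gb
        if n_chips == int(n_chips):
            widths.add(int(n_chips) * CHANNEL_BITS)
    return sorted(widths)

print(possible_bus_widths(3))  # [96, 192]  -> 3x1GB or 6x512MB chips
print(possible_bus_widths(4))  # [128, 256] -> 4x1GB or 8x512MB chips
# The two lists don't overlap, so a 3GB and a 4GB card can't share a bus width.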

 

Edit:

So it's not an assumption to know that there is a BOM cost difference, it's not an assumption to generally know what the BOM cost of the memory chips is, and it's not much of an assumption to know how that translates to a product price, and that the difference would normally be small enough that bothering at all would result in price padding to differentiate the SKUs (I'm not saying that's bad unless you have 6 of the same basic thing).

 

So we are going to have 6 different RTX 2060 variants on the market: two different memory types and two different bus widths (minimum), even before knowing whether there are GPU core differences or not. It's unnecessary under the same RTX 2060 model name.

 

It's not an argument against these 6 variants existing at all, but against them all carrying the same RTX 2060 model number.


1 minute ago, leadeater said:

True, there's often been more than one, but the issue isn't that there is; it's that there are 6, and that's never been the case before. You don't need 6 variants of the same GPU. We have model numbers, so if there actually is a meaningful difference between those 6 options then use model numbers, not sub-naming. If there isn't a meaningful difference between the 6, then how many are actually needed? 2? 3? 4?

 

This is a mid-range GPU, correct? If GDDR6 isn't actually going to make a performance difference, then why put it on the cards and increase the cost? It would make sense to use it on GPU configs that need the memory bandwidth, but then you're moving away from variants to actual GPU core differences, so they should be using model numbers.

 

Nvidia is likely using the 2060 to increase the volume of GDDR6 and lower its cost, so it makes business sense there. But with the GDDR5 options existing, which will/should be cheaper, if GDDR6 doesn't have a performance impact I doubt many are going to pay more for nothing.

I'm not a market analyst.

I don't think you are either.

But having 3, 5 and 6GB models probably paid off well for the 1060,

and we haven't even dug into OEM models or even considered mobile, where many small variables can make a huge difference on battery.


8 minutes ago, leadeater said:

It is literally impossible to have the same memory bus width with 3GB and 4GB options; that's not an assumption. You can know that from the sizes of memory chips available and how many are required to make up 3GB and 4GB.

 

That's memory. I specifically said:

 

15 minutes ago, mr moose said:

I am not the one making assumptions here. No one knows what the difference is other than memory.

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


4 minutes ago, pas008 said:

I'm not a market analyst.

I don't think you are either.

But having 3, 5 and 6GB models probably paid off well for the 1060,

and we haven't even dug into OEM models or even considered mobile, where many small variables can make a huge difference on battery.

The 5GB came later and, from memory, was not a global SKU. Only variants with different numbers of memory chips or different memory types would affect battery life, and you can have mobile-only SKUs; you don't have to scattershot the desktop market to solve a mobile platform problem.

 

And anyone can analyze the market; it doesn't have to be your literal job to be able to or be allowed to. But that's not even the problem or the point of view I'm looking at. That's the buyer wondering whether they need 3GB or 4GB, whether they should pay more, and what the difference between GDDR5 and GDDR6 is. Good luck explaining all that, even with benchmarks and data you can refer to.


7 minutes ago, mr moose said:

That's memory. I specifically said:

 

33 minutes ago, leadeater said:

Never confuse or assume that more options means better; first you must show that having more is better before you can say it's better.

 

Either you are assuming more options are good, or you are ignoring how multiple variants of the same card can be confusing for the consumer/buyer.

 

I'm not assuming more is bad; I know more is bad if the options don't actually make a difference, or if they inflate prices to accommodate their existence. If there is an actual performance difference between them, then using the same model number is again confusing to the consumer/buyer and therefore bad.

 

Critical thinking, applied here, moves the likelihood away from more options being good for the consumer/buyer and toward being worse.

