Reviews are already circulating on Youtube. I am very much interested in hearing your opinions:

 

GamersNexus

 


 

 

JayzTwoCents

 


 

 

LinusTechTips

 


 

 

HardwareUnboxed

 


 

 

Personal Opinion:

 

I think Linus missed the mark on this one, big time. I was almost tempted to think the review was paid for by Nvidia (as @Spotty noted, the review was made a bit too early, so I retract what I previously said for the time being), but I wouldn't go that far: Linus has a pretty good track record as a tech reviewer, and throwing around silly accusations hurts the conversation more than anything. I agree with Steve (GN) and the rest. It is overpriced, and Nvidia feels tone-deaf. What do you guys think?

 

- Edit 2: Retraction retracted; Linus doubled down on the newest WAN Show (06/05/2021):

 

- Edit 3: Linus, to Steve from GN: "I don't know how much used hardware Steve has actually ever bought in his life, but the way that it works is..." (goes on to explain how it works in his view) "I really don't know what he is basing that on, honestly." (Timestamp: 16:19)

 

- Edit 4: "If you had to calculate how many FPS per dollar, you are not the customer for a 3080 Ti in the first place. So why are you mad? What's the point of getting mad?" (Timestamp: 22:41)

 

- Edit 5: "If you're someone who is not in that price range and you're willing to scalp it, then (pauses for a bit) you'd be crazy not to buy it." Luke is visibly uncomfortable with the statement (more so, I think, with the one right before it). Linus goes on to re-clarify his previous position on scalping, which is that scalping is bad and he does not support it. (Timestamp: 26:04)

 

 


I think there is a sucker born every minute and Nvidia knows this 🙂

 

 

Emma : i9 9900K 5.1Ghz All-Core - Gigabyte AORUS Z370 Gaming 5 - Tt Water 3.0 Ultimate 360mm AIO - G. Skill Ripjaws V 32GB 3200Mhz - Gigabyte AORUS 1080Ti - 750 EVO 512GB + 2x 860 EVO 1TB M.2 (RAID 0) - EVGA Supernova 650 P2 - Fractal Design Define R6 - AOC AGON 35" 3440x1440 100Hz - Mackie CR5BT - Logitech G910, G502, G933 - Cooler Master Universal Graphics Card Holder

 

Plex : Ryzen 5 1600 (stock) - Gigabyte B450M-DS3H - G. Skill Ripjaws V 16GB 2400Mhz - MSI 1050Ti 4GB - 840 EVO 256GB + Toshiba P300 3TB - TP-Link AC1900 PCIe Wifi - Cooler Master MasterWatt Lite 500 - Antec Nine Hundred - Dell 24" 

 

Lenovo 720S Touch 15.6" - i7 7700HQ, 16GB RAM, 512GB NVMe SSD, 1050Ti, 4K touchscreen

 

MSI GF62 - i7 7700HQ, 16GB 2400 MHz RAM, 256GB NVMe SSD + 1TB 7200rpm HDD, 1050Ti

 

Other Tech - 2021 Volvo S60 Recharge T8 Polestar Engineered - 415hp/495tq 2.0L 4cyl. turbocharged, supercharged and electrified.


I pretty much agree most with this:

Really hit the nail on the head with that one.

 

I bet the fact that LTT is the one outlier in the conclusion makes all the tinfoil hatters out there really happy.

Disclaimer: If my post is older than 5 minutes, refresh the page. Most of my posts get edited straight away. 😄

 

PSU Tier List          AMD Motherboard Tier List          SSD Tier List

 

Current Specs:

CPU: AMD Ryzen 5 5600X - Motherboard: ASUS ROG Strix B550-E - GPU: PNY RTX 3080 XLR8 Epic-X - RAM: 4x8GB (32GB) G.Skill TridentZ RGB 3600MHz CL16 - PSU: Corsair RMx (2018) 850W - Storage: 500 GB Corsair MP600 (Boot) + 2 TB Sabrent Rocket Q (Storage) - Cooling: EK, HW Labs & Alphacool custom loop - Case: Lian-Li PC O11 Dynamic - Fans: 6x Noctua NF-A12x25 - AMP/DAC: FiiO K5 Pro - OS: Windows 10 Pro - Monitor: ASUS ROG Swift PG35VQ - Mouse: Logitech G Pro + Powerplay - Keyboard: Logitech G915 - Headphones: Beyerdynamic Amiron Home - Microphone: Antlion ModMic

 

Temperatures @steady state: Furmark + CinebenchR23 running for 1 hour. Fans @850RPM. Pump @1600RPM.

Water: 37°C

CPU: 73°C

GPU: 54°C


AMD reference cards are very easy to get. Look at how few GPU dies they make compared to NVIDIA. They just look desperate at this point.

I'm not actually trying to be as grumpy as it seems.

Project Hot Box

CPU Ryzen 7 5800x, Motherboard MSI MPG B550I Gaming Edge Wifi, RAM CORSAIR Vengeance LPX 32GB 3200MHz (2x16), GPU AMD Radeon 6800 XT, Case Silverstone RVZ03B, Storage CORSAIR Force Series MP510 1920GB NVMe, CORSAIR Force Series MP510 960GB NVMe, PSU CORSAIR SF600, Cooling Noctua NH-L12S, Displays LG 34UC98-W 34-Inch, 32 inch curved that I can't find the model number for right now; UPerfect 7 inch LCD Aida64 Display, Keyboard Corsair K95, Mouse Corsair Nightblade, Sound AT2020+ USB Mic, Massdrop 6xx headphones, Schiit Magni, Modi, Loki mini stack.

Oppbevaring

CPU i9-9900k, Motherboard ASUS ROG Maximus Code XI, RAM 48GB Corsair Vengeance LPX 3200MHz (2x16)+(2x8), GPU Asus ROG Strix 2070 8GB, MSI 1070 Gaming X 8GB, Case Phanteks Enthoo Evolv X, Storage Samsung 860 Evo 500GB, 5x Seagate IronWolf 8TB NAS (ZFS1), PSU Corsair RM1000x, Cooling Asus ROG Ryuo 240 with Noctua NF-12 fans, OS Unraid

 

Why is the 5800x so hot?

 

 


It comes down to that meme again.

nVidia the way games are meant to be played

&

AMD the way games can be played


So, as I expected: literally a 3090 with less VRAM and a teeny tiny clock hit. And if you take the average across the games tested, the 6900 XT seems to be the better buy, since ray tracing still hasn't taken off in enough games to make it a must-have feature. For games, that is.

3 minutes ago, rockyroller said:

It comes down to that meme again.

nVidia the way games are meant to be played

&

AMD the way games can be played

Well, you can't play games on either of them when you can't actually buy one right now...

Just now, jaslion said:

So, as I expected: literally a 3090 with less VRAM and a teeny tiny clock hit. And if you take the average across the games tested, the 6900 XT seems to be the better buy, since ray tracing still hasn't taken off in enough games to make it a must-have feature. For games, that is.

All the recent games I've played in the last few weeks are mainly ray-traced:

 

Metro Exodus Enhanced

Resident Evil Village

Control

Cyberpunk

 

I'd say we're at a point now where at least for AAA games it's becoming a standard.

25 minutes ago, Deus Voltage said:

I think Linus missed the mark on this one big time. I was almost tempted to think that the review was paid by Nvidia, but I wouldn't go that far as Linus has a pretty good track record as a tech reviewer and throwing around silly accusations hurts the conversation more than anything. I agree with Steve (GN) and the rest. It is overpriced and feels like Nvidia is tone deaf. What do you guys think?

LTT filmed their video before the price was released and their video suffered for it. They should have waited until they had all the information about the product before reviewing it.

CPU: Intel i7 6700k  | Motherboard: Gigabyte Z170x Gaming 5 | RAM: 2x16GB 3000MHz Corsair Vengeance LPX | GPU: Gigabyte Aorus GTX 1080ti | PSU: Corsair RM750x (2018) | Case: BeQuiet SilentBase 800 | Cooler: Arctic Freezer 34 eSports | SSD: Samsung 970 Evo 500GB + Samsung 840 500GB + Crucial MX500 2TB | Monitor: Acer Predator XB271HU + Samsung BX2450

Just now, Stahlmann said:

All the recent games i've played in the last few weeks are mainly ray-traced.

 

Metro Exodus Enhanced

Resident Evil Village

Control

Cyberpunk

 

I'd say we're at a point now where at least for AAA games it's becoming a standard.

It is there, but I meant as a noticeable visual difference. We've played Control and are now playing Village, and we just can't see a difference between having it on or off while playing. We tend to turn it off when we notice the 2080 struggling, and then notice basically no difference in-game.

 

That is what I meant by it not being a must-have feature: it's there, but there's no real need for it yet, as devs have gotten so good at faking these effects. I do love it for rendering, though, as it's a massive help there; hence the "for games" part.


The 3070 Ti is priced way more reasonably than the 3080 Ti is.
However, as everyone has pointed out, these cards aren't going to fix the GPU shortage; it's only getting worse. At this point we can confidently say that Nvidia just doesn't care. AMD did the obvious thing and didn't launch the 6600-series GPUs (except mobile) or the APUs because of the chip shortage, and if Nvidia wanted, they could go in a similar direction.

Make sure to quote me if you want me to respond
Thanks :)

Turn your Mobile VR or PSVR Headset into a working 6DoF SteamVR one guide/tutorial (below):

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

My Future PC Build for gaming and sometimes streaming 

Spoiler

https://de.pcpartpicker.com/list/GXzknL

 

CPU: Ryzen 5 5600X

 

Motherboard: MSI B450 Gaming Plus MAX/B450M Mortar Max

 

RAM: Corsair Vengeance LPX 16 GB (2x8GB) DDR4-3600 CL18

 

Storage: Western Digital Blue SN550 1 TB NVME SSD

 

GPU: Nvidia RTX 3060

 

Case: MSI MAG FORGE 100R

 

PSU: EVGA BQ 600W

 

1 minute ago, Spotty said:

LTT filmed their video before the price was released and their video suffered for it. They should have waited until they had all the information about the product before reviewing it.

Not even that; they could have filmed the conclusion after the announcement rather than 'fixing it in post'. I get that Linus is a busy guy, but I'm sure it wouldn't have been too difficult to spare half an hour to sit down and say a few more words!

 

They really do seem to have gone downhill in their reviews, there's a reason they aren't my 'go to'!

Laptop:

Spoiler

HP OMEN 15 - Intel Core i7 9750H, 16GB DDR4, 512GB NVMe SSD, Nvidia RTX 2060, 15.6" 1080p 144Hz IPS display

PC:

Spoiler

Vacancy - Looking for applicants, please send CV

Mac:

Spoiler

2009 Mac Pro 8 Core - 2 x Xeon E5520, 16GB DDR3 1333 ECC, 120GB SATA SSD, AMD Radeon 7850. Soon to be upgraded to 2 x 6 Core Xeons

Phones:

Spoiler

LG G6 - Platinum (The best colour of any phone, period)

LG G7 - Moroccan Blue

 

5 minutes ago, jaslion said:

It is there, but I meant as a noticeable visual difference. We've played Control and are now playing Village, and we just can't see a difference between having it on or off while playing. We tend to turn it off when we notice the 2080 struggling, and then notice basically no difference in-game.

That is what I meant by it not being a must-have feature: it's there, but there's no real need for it yet, as devs have gotten so good at faking these effects. I do love it for rendering, though, as it's a massive help there; hence the "for games" part.

I never tried Resident Evil, Control or Metro with RTX off because I have no reason to turn it off. Performance even at 4K is plenty for me. So I can't really say how these games look with traditional rendering.

 

But in Cyberpunk I played around a lot with the different graphics settings, and the difference between RTX on and off is massive imo.

 

While it may not be a must-have right now, it's moving in the direction of becoming a standard.

2 minutes ago, Stahlmann said:

I never tried Resident Evil, Control or Metro with RTX off because I have no reason to turn it off. Performance even at 4K is plenty for me. So I can't really say how these games look with traditional rendering.

But in Cyberpunk I played around a lot with the different graphics settings, and the difference between RTX on and off is massive imo.

While it may not be a must-have right now, it's moving in the direction of becoming a standard.

I'm not against it at all, if that's the vibe I gave off. My point is more that it really isn't a dealbreaker right now, both because the fake (rasterized) effects are so good and because most gamers don't have access to ray-tracing-capable hardware.

 

Cyberpunk I see as an outlier: that game was rushed, so I wouldn't be too surprised if ray tracing was the cheapest and quickest way to make the game look better, instead of polishing the pretty crappy shaders used when RT is off. In actually finished games there's little to no difference when standing still, and during gameplay it just fades away.

 

To me, the most notable use of ray tracing I've seen is in Spider-Man: Miles Morales, as you can still see it happening while playing. It only does reflections in the big glass skyscrapers, but that really does create some cool shots and moments that wouldn't be possible otherwise.

Link to post
Share on other sites
1 minute ago, jaslion said:

Cyberpunk I see as an outlier: that game was rushed, so I wouldn't be too surprised if ray tracing was the cheapest and quickest way to make the game look better, instead of polishing the pretty crappy shaders used when RT is off. In actually finished games there's little to no difference when standing still, and during gameplay it just fades away.

Cyberpunk still looks awesome without ray-tracing. I'm just saying the game has the most noticeable improvement.

 

The "fading" is very subjective. As ray-tracing is inherently more realistic than rasterized lighting it helps a lot with immersion.

2 minutes ago, Stahlmann said:

Cyberpunk still looks awesome without ray-tracing. I'm just saying the game has the most noticeable improvement.

 

The "fading" is very subjective. As ray-tracing is inherently more realistic than rasterized lighting it helps a lot with immersion.

That is an entirely fair point. What I do see happening eventually is ray tracing being used like on the consoles: for a select couple of simple effects instead of for big systems. That way the heavy lifting can be done as normal, and the otherwise minute or hard-to-fake effects can be done on the ray-tracing part of the GPU (like Spider-Man's window reflections), leveraging the best of the two systems.

18 minutes ago, yolosnail said:

They really do seem to have gone downhill in their reviews, there's a reason they aren't my 'go to'!

They've become a lot more "just for fun" than doing any serious reviewing, imo. Their target audience is basically people who are just getting into the industry. Anyone who has been here for some time will likely have already moved to more serious reviewers like Gamers Nexus and Hardware Unboxed for "real" and "technical" reviews.

 

Best example is the latest Mini-LED TV "review", where they got the Samsung TV: they basically just sat down in front of it and said "this is fine". No measurements, etc. LTT is a lot about subjectivity these days.

1 minute ago, jaslion said:

That is an entirely fair point. What I do see happening eventually is ray tracing being used like on the consoles: for a select couple of simple effects instead of for big systems. That way the heavy lifting can be done as normal, and the otherwise minute or hard-to-fake effects can be done on the ray-tracing part of the GPU (like Spider-Man's window reflections), leveraging the best of the two systems.

That's how most RTX games are. And I agree that there are some effects that are barely different from rasterization. I think Control is the only game so far that completely runs off ONLY ray tracing when it's enabled; every other game uses ray tracing only for specific parts of the lighting. (afaik)

Just now, Stahlmann said:

They've become a lot more "just for fun" than doing any serious reviewing, imo. Their target audience is basically people who are just getting into the industry. Anyone who has been here for some time will likely have already moved to more serious reviewers like Gamers Nexus and Hardware Unboxed for "real" and "technical" reviews.

But if that's the case, then surely they should put more of an emphasis on legitimate consumer advice.

 

It's all well and good saying it's a cheaper 3090 without the bits you don't need (the extra VRAM), but when a 3080 is nigh on half the price of the 3080 Ti and delivers most of the performance, surely they should say that's the better option!
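To put that price-versus-performance argument in numbers, here is a rough sketch. The MSRPs are the announced launch prices; the FPS figures are made-up placeholders standing in for an averaged benchmark result, not real measurements:

```python
# Rough FPS-per-dollar comparison at launch MSRP. Only the MSRPs are the
# announced launch prices; the FPS numbers are illustrative placeholders.
cards = {
    "RTX 3080":    {"msrp": 699,  "avg_fps": 100},  # baseline (placeholder FPS)
    "RTX 3080 Ti": {"msrp": 1199, "avg_fps": 110},  # ~10% faster (assumed gap)
}

for name, spec in cards.items():
    print(f"{name}: {spec['avg_fps'] / spec['msrp']:.3f} FPS per dollar")
```

With any performance gap smaller than the roughly 71% price premium, the 3080 wins on value by construction; that is the whole point of the complaint.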

1 hour ago, Deus Voltage said:

Reviews are already circulating on Youtube. I am very much interested in hearing your opinions:

 

GamersNexus

 


 

 

JayzTwoCents

 


 

 

LinusTechTips

 


 

 

HardwareUnboxed

 


 

 

Personal Opinion:

 

I think Linus missed the mark on this one, big time. I was almost tempted to think the review was paid for by Nvidia (as @Spotty noted, the review was made a bit too early, so I retract what I previously said for the time being), but I wouldn't go that far: Linus has a pretty good track record as a tech reviewer, and throwing around silly accusations hurts the conversation more than anything. I agree with Steve (GN) and the rest. It is overpriced, and Nvidia feels tone-deaf. What do you guys think?

How is it overpriced? What is price? Price is a bygone of a past era.

 

 

Imho, it should've been priced $100-159 cheaper; it would've been much better.


Much like the 2080 Ti, 6900 XT, and 3090, the 3080 Ti is an overpriced money grab targeting those without money sense or market forethought.

 

For those interested, here's something I posted back in September last year about the price history of Nvidia's top-end cards. Add the 3080 Ti and see just how disgusting the latest generations have been price-wise.

 

 

On 9/3/2020 at 2:01 PM, SolarNova said:

Historical initial MSRPs of the top-tier cards are as follows.

Excludes dual-GPU cards and OC'ed specials. (Feel free to correct me on anything listed, just provide links as proof... cheers)

 

Launch Year --- GPU ------------------------ Price ---- With Inflation

2000 --- GeForce 2 Ti ----------------- $500 ---- $750
2001 --- GeForce 3 Ti500 -------------- $350 ---- $515
2002 --- GeForce 4 Ti4600 ------------- $400 ---- $575
2003 --- GeForce FX 5950 Ultra -------- $500 ---- $705
2004 --- GeForce 6800 Ultra ----------- $500 ---- $685
2005 --- GeForce 7900 GTX ------------- $500 ---- $665
2006 --- GeForce 8800 GTX ------------- $600 ---- $770
2008 --- GeForce 9800 GTX+ ------------ $230 ---- $275   (no, this isn't a typo)
2009 --- GeForce GTX 285 -------------- $400 ---- $485
2010 --- GeForce GTX 480 -------------- $500 ---- $600
2011 --- GeForce GTX 580 -------------- $500 ---- $575
2012 --- GeForce GTX 680 -------------- $500 ---- $565
2013 --- GeForce GTX 780 Ti ----------- $700 ---- $775
2015 --- GeForce GTX 980 Ti ----------- $650 ---- $710
2017 --- GeForce GTX 1080 Ti ---------- $700 ---- $740
2018 --- GeForce RTX 2080 Ti ---------- $1200 --- $1230
2020 --- GeForce RTX 3090 ------------- $1500 --- $1500

 

Spot the outliers.

 

Now we can argue and debate about whether the 3090 is a Titan or not. If it is, fine, but then we should accept that a 3080 Ti was bound to release, since the 3080 isn't close enough in performance to the 3090 (per Nvidia's release-event graphs) to be the top-tier gaming card (all previous-gen Titans had a gaming card that was near identical in performance).

As such, this leaves Nvidia the chance to either do what they have done in the past, releasing the new x80 Ti at the same price as the x80 and dropping the x80 price, or do what they seem more likely to do nowadays: push the x80 Ti price to somewhere between the 3080 and 3090, likely $900-$1000.

 

~$700 (in today's currency) is not uncommon, as we can see, but anything above that is unacceptable.

 

Hope this clears up the debate over price.
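For anyone who wants to redo the "With Inflation" column themselves: it is just the nominal launch price scaled by a CPI ratio. A minimal sketch, using placeholder CPI values rather than official figures:

```python
# Sketch of how the "With Inflation" column is derived: nominal launch price
# scaled by the ratio of today's CPI to the launch-year CPI.
# The CPI values used below are rough placeholders, not official statistics.

def adjust_for_inflation(price, cpi_launch_year, cpi_now):
    """Return the launch price expressed in today's dollars."""
    return price * cpi_now / cpi_launch_year

# Example: a $700 card from 2017, with placeholder CPI values.
print(round(adjust_for_inflation(700, 245.1, 260.5)))  # 744 with these placeholder values
```

Swap in the real CPI series for each launch year and you can extend the table to the 3080 Ti yourself.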

 


 

 

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w | VDU: Panasonic 42" Plasma |

GPU: Gigabyte 1080ti Gaming OC w/OC & Barrow Block | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + Samsung 850 Evo 256GB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P |


Not surprised that they put out new cards; they're just going on schedule, and I think it makes more sense than not releasing them at all. If the chip shortage is remedied before they're ready to release their next line of cards, they'll have these to sell. It would look stupid to see supply normalize six months before the next launch, launch the Ti then, and then launch the new cards six months later. It's also not like they aren't making cards: loads of people are still getting them, there's just insane demand.

 

I personally can't wait for them to release their next line and for people to upgrade. The used pool is going to be fucking massive.

 

46 minutes ago, rockyroller said:

It comes down to that meme again.

nVidia the way games are meant to be played

&

AMD the way games can be played

If you're implying you can get AMD cards and can't get Nvidia, you're just wrong?

43 minutes ago, jaslion said:

So, as I expected: literally a 3090 with less VRAM and a teeny tiny clock hit. And if you take the average across the games tested, the 6900 XT seems to be the better buy, since ray tracing still hasn't taken off in enough games to make it a must-have feature. For games, that is.

There are other features that make their cards a little more desirable to some people, like anyone who does streaming, video calls, etc. Their added features are very strong.

14 minutes ago, Stahlmann said:

They've become a lot more "just for fun" than doing any serious reviewing imo. Their target audience is basically the people who are just getting into the industry. Anyone who has been here for some time will have likely already made their move to more serious reviewers like Gamers Nexus and Hardware Unboxed for "real" and "technical" reviews.

 

Best example is the latest MiniLED TV "review" where they got the samsung tv. They just basically sat down in front of it and said "this is fine". No measurements, etc... LTT is a lot about subjectivity these days.

They've been more on the entertainment side for years. I'd say they haven't really put out reviews that are all that useful for 5+ years; they're more of a product overview with a few graphs thrown in.

9 minutes ago, yolosnail said:

But if that's the case, then surely they should put more of an emphasis on legitimate consumer advice.

 

It's all well and good saying it's a cheaper 3090, without the bits you don't need (the extra VRAM), but when a 3080 is nigh on half the price of the 3080Ti and delivers most of the performance, surely they should say that's the better option!

Ti cards in the upper range have always been for people who want to eke out that extra little bit of performance and don't care what the extra cost is. It's not a bang-for-your-buck segment.

Current PC:

Spoiler

CPU: Intel i3 4160 Cooler: Integrated Motherboard: Integrated

RAM: G.Skill RipJaws 16GB DDR3 Storage: Transcend MSA370 128GB GPU: Intel 4400 Graphics

PSU: Integrated Case: Shuttle XPC Slim

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

Budget Rig 1 - Sold For $750 Profit

Spoiler

CPU: Intel i5 7600k Cooler: CryOrig H7 Motherboard: MSI Z270 M5

RAM: Crucial LPX 16GB DDR4 Storage: Intel S3510 800GB GPU: Nvidia GTX 980

PSU: Corsair CX650M Case: EVGA DG73

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

OG Gaming Rig - Gone

Spoiler

 

CPU: Intel i5 4690k Cooler: Corsair H100i V2 Motherboard: MSI Z97i AC ITX

RAM: Crucial Ballistix 16GB DDR3 Storage: Kingston Fury 240GB GPU: Asus Strix GTX 970

PSU: Thermaltake TR2 Case: Phanteks Enthoo Evolv ITX

Monitor: Dell P2214H x2 Mouse: Logitech MX Master Keyboard: G.Skill KM780 Cherry MX Red

 

 


I think reviews and opinions don't change the fact that we cannot buy anything. I've stated the obvious, but still... I have almost lost interest in all of the above opinions.

Link to post
Share on other sites
5 minutes ago, dizmo said:

Ti cards in the upper range have always been for people who want to eke out that extra little bit of performance and don't care what the extra cost is. It's not a bang-for-your-buck segment.

I disagree; that was the Titan, and now the 3090.

The *80 Ti was the bang-for-the-buck card of the high end: you got most of the top-end performance for a lower price.

4 minutes ago, SolarNova said:

Much like the 2080 Ti, 6900 XT, and 3090, the 3080 Ti is an overpriced money grab targeting those without money sense or market forethought.

For those interested, here's something I posted back in September last year about the price history of Nvidia's top-end cards. Add the 3080 Ti and see just how disgusting the latest generations have been price-wise.

 

 

 


 

 

Oh hey, yeah, I saw this post, and I've seen some video essays about this (and many other products doing the exact same thing). The less-for-more cycle is super cruddy and I really dislike it.

 

I fully remember when spending in the then $250-300 bracket could get you a hell of a solid GPU that would last for years to come; nowadays, even at MSRP, you're getting at best a low-to-mid mid-range GPU. It's really just trying to get as much money out of people while doing the least possible.

 

It's even worse when you compare them with inflation-adjusted prices and the average performance increase between cards: it has basically been "price goes up, performance gain goes down" with every release. The 1060 lineup was the oddball in that graph (well, the whole 1000 series was the oddball, as we got four cards that went beyond the performance of the last gen's highest-end card for actually decent prices).

